962 results for MONTE-CARLO SIMULATION


Relevance:

100.00%

Publisher:

Abstract:

Marketing scholars are increasingly recognizing the importance of investigating phenomena at multiple levels. However, the analysis methods currently dominant within marketing may not be appropriate for dealing with multilevel or nested data structures. We identify the state of contemporary multilevel marketing research, finding that typical empirical approaches within marketing research may be less effective at explicitly taking account of multilevel data structures than those in other organizational disciplines. A Monte Carlo simulation, based on results from a previously published marketing study, demonstrates that different approaches to analysis of the same data can produce very different results (both in terms of power and effect size). The implication is that marketing scholars should be cautious when analyzing multilevel or other grouped data, and we provide a discussion of and introduction to the use of hierarchical linear modeling for this purpose.
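
The multilevel point can be illustrated with a small Monte Carlo sketch (parameter values invented for illustration, not taken from the published study): when a predictor varies only at the group level and observations within a group share a random intercept, pooled OLS reports a naive standard error well below the slope's true sampling variability.

```python
import math
import random

def slope_sd_naive_vs_empirical(reps=400, groups=20, per_group=10,
                                beta=0.5, group_sd=1.0, noise_sd=1.0, seed=1):
    """Compare the iid-error OLS standard error with the empirical sampling
    SD of the slope when data are nested. All parameters are illustrative."""
    rng = random.Random(seed)
    estimates, naive_ses = [], []
    for _ in range(reps):
        xs, ys = [], []
        for _ in range(groups):
            xg = rng.gauss(0, 1)         # predictor measured at the group level
            ug = rng.gauss(0, group_sd)  # unobserved group random intercept
            for _ in range(per_group):
                xs.append(xg)
                ys.append(beta * xg + ug + rng.gauss(0, noise_sd))
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
        a = my - b * mx
        rss = sum((y - a - b * x) ** 2 for x, y in zip(xs, ys))
        estimates.append(b)
        naive_ses.append(math.sqrt(rss / (n - 2) / sxx))  # iid-error formula
    mean_b = sum(estimates) / reps
    empirical_sd = math.sqrt(sum((e - mean_b) ** 2 for e in estimates) / (reps - 1))
    return empirical_sd, sum(naive_ses) / reps
```

With these settings the empirical SD of the slope is roughly twice the average naive standard error, which is exactly the kind of power/effect-size distortion the abstract warns about.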

Relevance:

100.00%

Publisher:

Abstract:

Visual detection performance (d') is usually an accelerating function of stimulus contrast, which could imply a smooth, threshold-like nonlinearity in the sensory response. Alternatively, Pelli (1985, Journal of the Optical Society of America A 2 1508-1532) developed the 'uncertainty model' in which responses were linear with contrast, but the observer was uncertain about which of many noisy channels contained the signal. Such internal uncertainty effectively adds noise to weak signals, and predicts the nonlinear psychometric function. We re-examined these ideas by plotting psychometric functions (as z-scores) for two observers (SAW, PRM) with high precision. The task was to detect a single, vertical, blurred line at the fixation point, or identify its polarity (light vs dark). Detection of a known polarity was nearly linear for SAW but very nonlinear for PRM. Randomly interleaving light and dark trials reduced performance and rendered it nonlinear for SAW, but had little effect for PRM. This occurred for both single-interval and 2AFC procedures. The whole pattern of results was well predicted by our Monte Carlo simulation of Pelli's model, with only two free parameters. SAW (highly practised) had very low uncertainty. PRM (with little prior practice) had much greater uncertainty, resulting in lower contrast sensitivity, nonlinear performance, and no effect of external (polarity) uncertainty. For SAW, identification was about √2 better than detection, implying statistically independent channels for stimuli of opposite polarity, rather than an opponent (light - dark) channel. These findings strongly suggest that noise and uncertainty, rather than sensory nonlinearity, limit visual detection.
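
A minimal simulation of the uncertainty model (illustrative trial counts and channel numbers, not the paper's fitted two-parameter model) reproduces both signatures: chance performance at zero contrast, and depressed sensitivity to weak signals when the observer must monitor many irrelevant noisy channels.

```python
import random

def pc_2afc(contrast, channels, trials=4000, seed=0):
    """2AFC proportion correct under a max-of-M noisy channels uncertainty
    model (after Pelli, 1985): the signal adds `contrast` to one channel of
    unit-variance noise, and the observer chooses the interval whose maximum
    channel response is larger."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        sig = rng.gauss(contrast, 1)          # signal channel, signal interval
        for _ in range(channels - 1):
            sig = max(sig, rng.gauss(0, 1))   # irrelevant monitored channels
        noise = rng.gauss(0, 1)               # noise-only interval
        for _ in range(channels - 1):
            noise = max(noise, rng.gauss(0, 1))
        correct += sig > noise
    return correct / trials
```

A low-uncertainty observer (one channel) gains much more from the same contrast than a high-uncertainty observer monitoring a hundred channels, which is the model's account of the difference between the practised and unpractised observers.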

Relevance:

100.00%

Publisher:

Abstract:

Fault tree analysis is used as a tool within hazard and operability (Hazop) studies. The present study proposes a new methodology for obtaining the exact TOP event probability of coherent fault trees. The technique uses a top-down approach similar to that of FATRAM. This new Fault Tree Disjoint Reduction Algorithm resolves all the intermediate events in the tree except OR gates with basic event inputs, so that a near-minimal cut sets expression is obtained. Then Bennetts' disjoint technique is applied and the remaining OR gates are resolved. The technique has been found to be an appropriate alternative to Monte Carlo simulation methods when rare events are encountered and exact results are needed. The algorithm has been developed in FORTRAN 77 on the Perq workstation as an addition to the Aston Hazop package. The Perq graphical environment enabled a friendly user interface to be created. The total package takes as its input cause and symptom equations using Lihou's form of coding and produces both drawings of fault trees and the Boolean sum-of-products expression into which reliability data can be substituted directly.
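
The idea behind a disjoint (mutually exclusive) form can be shown on a toy coherent tree, TOP = A OR (B AND C) - this is not the FATRAM-style algorithm of the study, just an illustration of why disjoint terms give exact probabilities that direct Monte Carlo sampling only approximates, poorly so for rare events.

```python
import random

def top_exact(pa, pb, pc):
    """Exact TOP probability for the toy tree TOP = A OR (B AND C), using
    the disjoint form A OR (not-A AND B AND C): the two terms are mutually
    exclusive, so their probabilities simply add."""
    return pa + (1 - pa) * pb * pc

def top_monte_carlo(pa, pb, pc, trials=200_000, seed=0):
    """Direct-sampling estimate of the same TOP event, for comparison."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a = rng.random() < pa
        b = rng.random() < pb
        c = rng.random() < pc
        hits += a or (b and c)
    return hits / trials
```

For pa=0.01, pb=0.1, pc=0.2 the disjoint form gives exactly 0.01 + 0.99(0.1)(0.2) = 0.0298, while the sampling estimate carries statistical error that grows relatively larger as the events become rarer.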

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes the procedure and results from four years of research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique, VERT (Venture Evaluation and Review Technique), was used to model the pre-tender costs of public health, heating, ventilating, air-conditioning, fire protection, lift and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which previously had defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data were updated and adjusted using mechanical and electrical pre-tender cost indices and factors for location, selection of contractor, contract sum, height and site conditions. Ranges of cost, time and performance data were represented by probability density functions defined by constant, uniform, normal and beta distributions. These variables and a network of the interrelationships between services components provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From these data, alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost-significant items were isolated for closer examination.
The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
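
The cost roll-up step can be sketched as follows - a hypothetical three-element services estimate with uniform, normal and scaled-beta cost distributions (all figures invented for illustration, not taken from the thesis data), from which percentiles of the total give the estimator the cost ranges described above.

```python
import random

def pre_tender_percentiles(trials=5000, seed=42):
    """VERT-style Monte Carlo roll-up of element costs into a total
    services cost distribution. Distributions and figures are illustrative."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        heating = rng.uniform(90, 110)            # uniform cost range
        electrical = rng.gauss(200, 10)           # normally distributed cost
        lifts = 50 + 30 * rng.betavariate(2, 5)   # scaled beta cost
        totals.append(heating + electrical + lifts)
    totals.sort()
    def pick(q):
        return totals[int(q * trials)]
    return pick(0.10), pick(0.50), pick(0.90)
```

The sorted totals are exactly the cumulative frequency distribution the VERT output provides; the spread between the 10th and 90th percentiles expresses the risk attached to the estimate.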

Relevance:

100.00%

Publisher:

Abstract:

This work presents a two-dimensional risk assessment method based on quantifying the probability of occurrence of contaminant source terms and assessing the resultant impacts. The risk is calculated using Monte Carlo simulation, whereby synthetic contaminant source terms are generated from the same distribution as historically occurring pollution events, or from an a priori potential probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer are then simulated from repeated realisations using integrated mathematical models. The number of times user-defined ranges of concentration magnitudes are exceeded is quantified as the risk. The utility of the method was demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together was evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as the trends within the study area. The capability to generate synthetic pollution events from numerous potential sources based on the historical frequency of their occurrence proved to be a great asset of the method and a significant advantage over contemporary methods.
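
The exceedance-counting idea can be sketched in a few lines - here a toy attenuation factor stands in for the study's integrated transport models, and the source-term distribution is an assumed lognormal rather than one fitted to historical pollution events.

```python
import random

def exceedance_risk(threshold, realisations=2000, seed=7):
    """Risk as an exceedance frequency: generate synthetic source terms,
    pass them through a toy source-to-borehole attenuation factor, and
    count how often the simulated concentration at the monitoring point
    exceeds `threshold`. Distributions are illustrative assumptions."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(realisations):
        source = rng.lognormvariate(0.0, 1.0)  # synthetic contaminant source term
        attenuation = rng.uniform(0.05, 0.2)   # toy transport/dilution factor
        exceed += source * attenuation > threshold
    return exceed / realisations
```

Evaluating this at every monitoring point for each user-defined concentration range is what produces the risk maps described in the abstract.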

Relevance:

100.00%

Publisher:

Abstract:

We have developed a new technique for extracting histological parameters from multi-spectral images of the ocular fundus. The new method uses a Monte Carlo simulation of the reflectance of the fundus to model how the spectral reflectance of the tissue varies with differing tissue histology. The model is parameterised by the concentrations of the five main absorbers found in the fundus: retinal haemoglobins, choroidal haemoglobins, choroidal melanin, RPE melanin and macular pigment. These parameters are shown to give rise to distinct variations in the tissue colouration. We use the results of the Monte Carlo simulations to construct an inverse model which maps tissue colouration onto the model parameters. This allows the concentration and distribution of the five main absorbers to be determined from suitable multi-spectral images. We propose the use of "image quotients" to allow this information to be extracted from uncalibrated image data. The filters used to acquire the images are selected to ensure a one-to-one mapping between model parameters and image quotients. To recover five model parameters uniquely, images must be acquired in six distinct spectral bands. Theoretical investigations suggest that retinal haemoglobins and macular pigment can be recovered with RMS errors of less than 10%. We present parametric maps showing the variation of these parameters across the posterior pole of the fundus. The results are in agreement with known tissue histology for normal healthy subjects. We also present an early result which suggests that, with further development, the technique could be used to successfully detect retinal haemorrhages.

Relevance:

100.00%

Publisher:

Abstract:

Purpose - To generate a reflectance model of the fundus that allows an accurate non-invasive quantification of blood and pigments. Methods - A Monte Carlo simulation was used to produce a mathematical model of light interaction with the fundus at different wavelengths. The model predictions were compared with fundus images from normal volunteers in several spectral bands (peaks at 507, 525, 552, 585, 596 and 611nm). The model was then used to calculate the concentration and distribution of the known absorbing components of the fundus. Results - The shape of the statistical distribution of the image data generally corresponded to that of the model data; the model, however, appears to overestimate the reflectance of the fundus in the longer wavelength region. As the absorption by xanthophyll has no significant effect on light transport above 534nm, its distribution in the fundus was quantified: the wavelengths where both shape and distribution of image and model data matched (<553nm) were used to train a neural network which was then applied to every point in the image data. The xanthophyll distribution thus found was in agreement with published literature data in normal subjects. Conclusion - We have developed a method for optimising multi-spectral imaging of the fundus and a computer image analysis capable of estimating information about the structure and properties of the fundus. The technique successfully calculates the distribution of xanthophyll in the fundus of healthy volunteers. Further improvement of the model is required to allow the deduction of other parameters from images; investigations in known pathology models are also necessary to establish if this method is of clinical use in detecting early chorioretinopathies, hence providing a useful screening and diagnostic tool.

Relevance:

100.00%

Publisher:

Abstract:

Петър Господинов, Добри Данков, Владимир Русинов, Стефан Стефанов - Cylindrical Couette flow of a rarefied gas between two rotating cylinders is investigated. Profiles of pressure, velocity and temperature are obtained by the direct simulation Monte Carlo method (DSMC) and by numerical solution of the Navier-Stokes equations for a compressible fluid. The results show very good agreement for small Knudsen numbers, Kn = 0.02. It is shown that, under different kinematic boundary conditions, the gas lags behind or runs ahead of the wall velocity, or behaves like a rigid elastic body. The results obtained are important for solving non-planar microfluidic problems in which curvature effects must be taken into account.

Relevance:

100.00%

Publisher:

Abstract:

This paper shows how the angular uncertainties can be determined for a rotary-laser automatic theodolite of the type used in iGPS (indoor GPS) networks. Initially, the fundamental physics of the rotating-head device is used to propagate uncertainties using Monte Carlo simulation. This theoretical element of the study shows how the angular uncertainty is affected by internal parameters, the actual values of which are estimated. Experiments are then carried out to determine the actual uncertainty in the azimuth angle. Results are presented that show that uncertainty decreases with sampling duration. Other significant findings are that uncertainty is relatively constant throughout the working volume and that the uncertainty value is not dependent on the size of the reference angle. © 2009 IMechE.
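
The finding that uncertainty decreases with sampling duration is the familiar averaging effect, which a Monte Carlo sketch makes concrete - the per-reading noise level below is an assumed figure for illustration, not an iGPS specification.

```python
import math
import random

def azimuth_sd(n_samples, repeats=1000, noise_sd=0.01, seed=3):
    """Empirical standard deviation of the mean of n noisy azimuth readings
    (radians). Longer sampling duration (more readings averaged) shrinks the
    uncertainty roughly as 1/sqrt(n). Noise level is an assumed value."""
    rng = random.Random(seed)
    means = []
    for _ in range(repeats):
        total = sum(rng.gauss(0.7, noise_sd) for _ in range(n_samples))
        means.append(total / n_samples)
    mu = sum(means) / repeats
    return math.sqrt(sum((m - mu) ** 2 for m in means) / (repeats - 1))
```

Doubling the sampling duration halves the variance of the averaged azimuth, matching the experimentally observed decrease of uncertainty with duration.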

Relevance:

100.00%

Publisher:

Abstract:

Purpose: To investigate the use of MRIA for quantitative characterisation of subretinal fibrosis secondary to nAMD. Methods: MRIA images of the posterior pole were acquired over 4 months from 20 eyes, including those with inactive subretinal fibrosis and those being treated with ranibizumab for nAMD. Changes in morphology of the macula affected by nAMD were modelled, and reflectance spectra at the MRIA acquisition wavelengths (507, 525, 552, 585, 596, 611 and 650nm) were computed using Monte Carlo simulation. Quantitative indicators of fibrosis were derived by matching image spectra to the model spectra of known morphological properties. Results: The model spectra were comparable to the image spectra, both normal and pathological. The key morphological changes that the model associated with nAMD were gliosis of the IS-OS junction, decrease in retinal blood and decrease in RPE melanin. However, these changes were not specific to fibrosis, and none of the quantitative indicators showed a unique association with the degree of fibrosis. Moderate correlations were found with the clinical assessment, but not with the treatment program. Conclusion: MRIA can distinguish subretinal fibrosis from healthy tissue. The methods used show high sensitivity but low specificity, being unable to distinguish scarring from other abnormalities like atrophy. Quantification of scarring was not achieved with the wavelengths used, due to the complex structural changes to retinal tissues in the process of nAMD. Further studies, incorporating other wavelengths, will establish whether MRIA has a role in the assessment of subretinal fibrosis in the context of retinal and choroidal pathology.

Relevance:

100.00%

Publisher:

Abstract:

The probability density function (pdf) for a sum of n correlated lognormal variables is derived as a special convolution integral. The pdf for weighted sums (where the weights can be any real numbers) is also presented. The result for four dimensions was checked by Monte Carlo simulation.
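
A Monte Carlo check of the kind described can be sketched for the two-dimensional case: correlated underlying normals are built by mixing independent Gaussians, and the simulated mean of the sum is compared against the closed-form moment (which, unlike the full pdf, needs no convolution integral).

```python
import math
import random

def mc_mean_of_sum(rho=0.5, mu=0.0, sigma=1.0, n=50_000, seed=11):
    """Monte Carlo mean of S = X1 + X2 where X_i = exp(mu + sigma*Z_i) and
    the underlying standard normals Z1, Z2 have correlation rho."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho * rho) * rng.gauss(0, 1)
        total += math.exp(mu + sigma * z1) + math.exp(mu + sigma * z2)
    return total / n

# Sanity check: E[S] = 2*exp(mu + sigma**2/2) regardless of rho; the
# correlation affects the variance and the shape of the pdf, not the mean.
```

Matching the simulated mean to 2·exp(μ + σ²/2) for several values of ρ validates the sampler before it is used to check the derived density itself.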

Relevance:

100.00%

Publisher:

Abstract:

The aim of the case study is to quantify the impact of delayed repair time on revenues and profit, using the example of the outage of power plant units. Main steps of the risk assessment:
• creating a project plan suitable for risk assessment
• identification of the risk factors for each project activity
• scenario-analysis-based evaluation of risk factors
• selection of the critical risk factors based on the results of quantitative risk analysis
• formulating risk response actions for the critical risks
• running a Monte-Carlo simulation [1] using the results of the scenario analysis
• building a macro which creates the connection among the results of the risk assessment, the production plan and the business plan.
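
The simulation step can be sketched as a scenario-weighted repair delay multiplied by an uncertain lost margin per outage day - scenario probabilities, delays and margins below are invented for illustration, not figures from the case study.

```python
import random

def lost_profit_distribution(trials=5000, seed=5):
    """Monte Carlo distribution of profit lost to a delayed repair.
    Each trial draws a delay scenario and an uncertain daily lost margin;
    returns (mean loss, 95th-percentile loss). All figures are illustrative."""
    rng = random.Random(seed)
    scenarios = [(0.6, 0), (0.3, 5), (0.1, 20)]  # (probability, delay in days)
    losses = []
    for _ in range(trials):
        u, delay, acc = rng.random(), scenarios[-1][1], 0.0
        for p, d in scenarios:
            acc += p
            if u < acc:
                delay = d
                break
        margin = rng.uniform(80, 120)  # lost margin per outage day (assumed units)
        losses.append(delay * margin)
    losses.sort()
    return sum(losses) / trials, losses[int(0.95 * trials)]
```

The tail percentile, not the mean, is what the feed into the production and business plans typically needs: a rare long outage dominates the downside.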

Relevance:

100.00%

Publisher:

Abstract:

Enterprise-level risk management is a strategically important element of responsible corporate governance and one of the most challenging areas for corporate management today. Effective enterprise risk management cannot be achieved solely by following the risk management principles set out in the general international and domestic literature; when designing the risk management system, both industry-specific and company-specific characteristics must be taken into account. This is especially important for a company with as specialised an activity as an electricity transmission system operator (TSO). In this article, based on research carried out in cooperation with the Hungarian electricity transmission system operator, the authors present a complex theoretical and practical framework from which a new risk management methodology, uniform across functional areas, was developed for the TSO (focusing on the methodological steps of risk identification and quantification), suitable for determining enterprise-level risk exposure. _______ This study handles one of today's most challenging areas of enterprise management: the development and introduction of an integrated and efficient risk management system. For companies operating in specific network industries with a dominant market share and a key role in the national economy, such as electricity TSO's, risk management is of stressed importance. The study introduces an innovative, mathematically and statistically grounded as well as economically reasoned management approach for the identification, individual effect calculation and summation of risk factors. Every building block is customized for the organizational structure and operating environment of the TSO.
While the identification phase guarantees all-inclusivity, the calculation phase incorporates expert techniques and Monte Carlo simulation and the summation phase presents an expected combined distribution and value effect of risks on the company’s profit lines based on the previously undiscovered correlations between individual risk factors.

Relevance:

100.00%

Publisher:

Abstract:

Prior research has established that idiosyncratic volatility of securities prices exhibits a positive trend. This trend and other factors have made the merits of investment diversification and portfolio construction more compelling.

A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of using this algorithm are to: (a) increase the efficiency of the portfolio optimization process, (b) implement large-scale optimizations, and (c) improve the resulting optimal weights. In addition, the technique utilizes a novel approach in the construction of a time-varying covariance matrix. This involves the application of a modified integrated dynamic conditional correlation GARCH (IDCC-GARCH) model to account for the dynamics of the conditional covariance matrices that are employed.

The stochastic aspects of the expected return of the securities are integrated into the technique through Monte Carlo simulations. Instead of representing the expected returns as deterministic values, they are assigned simulated values based on their historical measures. The time-series of the securities are fitted to a probability distribution that matches the time-series characteristics using the Anderson-Darling goodness-of-fit criterion. Simulated and actual data sets are used to further generalize the results. Employing the S&P500 securities as the base, 2000 simulated data sets are created using Monte Carlo simulation. In addition, the Russell 1000 securities are used to generate 50 sample data sets.

The results indicate an increase in risk-return performance. Choosing Value-at-Risk (VaR) as the criterion and the Crystal Ball portfolio optimizer, a commercial product currently available on the market, as the benchmark for comparison, the new greedy technique clearly outperforms the others using a sample of the S&P500 and the Russell 1000 securities. The resulting improvements in performance are consistent among five securities selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional GARCH).
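
The greedy idea can be sketched in a few lines: hand out portfolio weight in small increments, each time to the asset that most improves the objective. The dissertation's algorithm optimises a VaR criterion over simulated returns; the stripped-down mean-variance utility below merely stands in for that objective to show the mechanics.

```python
def greedy_weights(mu, cov, step=0.01, risk_aversion=2.0):
    """Greedy portfolio sketch: allocate weight in `step` increments, each
    increment going to the asset that most improves a mean-variance utility.
    `mu` is a list of expected returns; `cov` a covariance matrix (lists)."""
    n = len(mu)
    w = [0.0] * n

    def utility(weights):
        ret = sum(wi * mi for wi, mi in zip(weights, mu))
        var = sum(weights[i] * weights[j] * cov[i][j]
                  for i in range(n) for j in range(n))
        return ret - risk_aversion * var

    for _ in range(round(1.0 / step)):
        best_i, best_u = 0, None
        for i in range(n):
            trial = list(w)
            trial[i] += step  # try giving this asset the next increment
            u = utility(trial)
            if best_u is None or u > best_u:
                best_i, best_u = i, u
        w[best_i] += step
    return w
```

Each pass costs only n utility evaluations, which is what makes the greedy scheme attractive for the large-scale optimizations mentioned above; the increments implicitly equalise marginal utility across the held assets.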

Relevance:

100.00%

Publisher:

Abstract:

Hydrophobicity as measured by Log P is an important molecular property related to toxicity and carcinogenicity. With increasing public health concerns about the effects of Disinfection By-Products (DBPs), there are considerable benefits in developing Quantitative Structure-Activity Relationship (QSAR) models capable of accurately predicting Log P. In this research, Log P values of 173 DBP compounds in 6 functional classes were used to develop QSAR models, by applying 3 molecular descriptors, namely, Energy of the Lowest Unoccupied Molecular Orbital (ELUMO), Number of Chlorine (NCl) and Number of Carbon (NC), in Multiple Linear Regression (MLR) analysis. The QSAR models developed were validated based on the Organization for Economic Co-operation and Development (OECD) principles. The model Applicability Domain (AD) and mechanistic interpretation were explored. Considering the very complex nature of DBPs, the established QSAR models performed very well with respect to goodness-of-fit, robustness and predictability. The predicted values of Log P of DBPs by the QSAR models were found to be significant, with a correlation coefficient R2 from 81% to 98%. The Leverage Approach by Williams Plot was applied to detect and remove outliers, consequently increasing R2 by approximately 2% to 13% for different DBP classes. The developed QSAR models were statistically validated for their predictive power by the Leave-One-Out (LOO) and Leave-Many-Out (LMO) cross-validation methods. Finally, Monte Carlo simulation was used to assess the variations and inherent uncertainties in the QSAR models of Log P and determine the most influential parameters in connection with Log P prediction.
The developed QSAR models in this dissertation will have a broad applicability domain because the research data set covered six out of eight common DBP classes, including halogenated alkane, halogenated alkene, halogenated aromatic, halogenated aldehyde, halogenated ketone, and halogenated carboxylic acid, which have been brought to the attention of regulatory agencies in recent years. Furthermore, the QSAR models are suitable to be used for prediction of similar DBP compounds within the same applicability domain. The selection and integration of various methodologies developed in this research may also benefit future research in similar fields.
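
The MLR step at the core of such QSAR models is ordinary least squares on the three descriptors. A self-contained sketch (the synthetic descriptor values and coefficients below are invented for illustration, not the dissertation's 173-compound data set):

```python
def mlr_fit(X, y):
    """Ordinary least squares via the normal equations (X'X)beta = X'y,
    solved by Gaussian elimination with partial pivoting. X is a list of
    descriptor rows, e.g. (ELUMO, NCl, NC); an intercept is added."""
    rows = [[1.0] + list(r) for r in X]
    p = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    for c in range(p):                       # forward elimination
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            for k in range(c, p):
                A[r][k] -= f * A[c][k]
            b[r] -= f * b[c]
    beta = [0.0] * p                          # back substitution
    for r in range(p - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][k] * beta[k] for k in range(r + 1, p))) / A[r][r]
    return beta
```

Fitting on data constructed to be exactly linear in (ELUMO, NCl, NC) recovers the generating coefficients, which is a useful unit check before applying the same routine to real descriptor data.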