971 results for MONTE-CARLO SIMULATION


Relevance: 100.00%

Abstract:

Fuzzy data has grown to be an important factor in data mining. Whenever uncertainty exists, simulation can be used as a model. Simulation is very flexible, although it can involve significant levels of computation. This article discusses fuzzy decision-making using the grey related analysis method. Fuzzy models are expected to better reflect decision-making uncertainty, at some cost in accuracy relative to crisp models. Monte Carlo simulation is used to incorporate experimental levels of uncertainty into the data and to measure the impact of fuzzy decision tree models using categorical data. Results are compared with decision tree models based on crisp continuous data.
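The core idea of injecting experimental levels of uncertainty into data before categorization can be sketched with a stdlib-only Python fragment: perturb a crisp score with uniform noise and tally the resulting category frequencies. The cut-points and noise level are illustrative assumptions, not the article's model:

```python
import random

random.seed(1)

def categorize(x, cuts=(0.33, 0.66)):
    """Map a crisp score in [0, 1] to a category (assumed cut-points)."""
    if x < cuts[0]:
        return "low"
    if x < cuts[1]:
        return "medium"
    return "high"

def mc_category_distribution(x, noise=0.1, n=10_000):
    """Monte Carlo: perturb a crisp value with uniform noise and count
    how often each category results -- a proxy for fuzzy membership."""
    counts = {"low": 0, "medium": 0, "high": 0}
    for _ in range(n):
        xp = min(1.0, max(0.0, x + random.uniform(-noise, noise)))
        counts[categorize(xp)] += 1
    return {k: v / n for k, v in counts.items()}

print(mc_category_distribution(0.30, noise=0.1))
```

A value near a cut-point spreads its mass over two categories, which is exactly the kind of boundary uncertainty that distinguishes the fuzzy models from their crisp counterparts.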

Relevance: 100.00%

Abstract:

Markov chain Monte Carlo (MCMC) is a methodology that is gaining widespread use in the phylogenetics community and is central to phylogenetic software packages such as MrBayes. An important issue for users of MCMC methods is how to select appropriate values for adjustable parameters such as the length of the Markov chain or chains, the sampling density, the proposal mechanism, and, if Metropolis-coupled MCMC is being used, the number of heated chains and their temperatures. Although some parameter settings have been examined in detail in the literature, others are frequently chosen with more regard to computational time or personal experience with other data sets. Such choices may lead to inadequate sampling of tree space or an inefficient use of computational resources. We performed a detailed study of convergence and mixing for 70 randomly selected, putatively orthologous protein sets with different sizes and taxonomic compositions. Replicated runs from multiple random starting points permit a more rigorous assessment of convergence, and we developed two novel statistics, delta and epsilon, for this purpose. Although likelihood values invariably stabilized quickly, adequate sampling of the posterior distribution of tree topologies took considerably longer. Our results suggest that multimodality is common for data sets with 30 or more taxa and that this results in slow convergence and mixing. However, we also found that the pragmatic approach of combining data from several short, replicated runs into a metachain to estimate bipartition posterior probabilities provided good approximations, and that such estimates were no worse in approximating a reference posterior distribution than those obtained using a single long run of the same length as the metachain. Precision appears to be best when heated Markov chains have low temperatures, whereas chains with high temperatures appear to sample trees with high posterior probabilities only rarely. 
[Bayesian phylogenetic inference; heating parameter; Markov chain Monte Carlo; replicated chains.]
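The delta and epsilon statistics themselves are not specified in this abstract, but the underlying ideas, comparing bipartition posterior estimates across replicated runs and pooling short runs into a metachain, can be sketched as follows (toy data; string labels stand in for real bipartitions of taxa):

```python
def split_frequencies(trees):
    """Posterior probability of each bipartition = its sample frequency."""
    freq = {}
    for splits in trees:          # each sample: a set of bipartition labels
        for s in splits:
            freq[s] = freq.get(s, 0) + 1
    n = len(trees)
    return {s: c / n for s, c in freq.items()}

def max_split_discrepancy(run_a, run_b):
    """Largest disagreement in bipartition posteriors between two runs --
    a simple between-run convergence diagnostic."""
    fa, fb = split_frequencies(run_a), split_frequencies(run_b)
    return max(abs(fa.get(s, 0.0) - fb.get(s, 0.0)) for s in set(fa) | set(fb))

def metachain(*runs):
    """Pool replicated short runs into one sample (the 'metachain')."""
    return [t for run in runs for t in run]

# Toy replicated runs of 10 tree samples each:
a = [{"AB"}] * 8 + [{"AC"}] * 2
b = [{"AB"}] * 6 + [{"AC"}] * 4
print(round(max_split_discrepancy(a, b), 6))
print(split_frequencies(metachain(a, b)))
```

A discrepancy near zero across replicates is evidence of convergence; the pooled metachain estimate averages out run-to-run sampling noise, mirroring the abstract's finding that several short runs can approximate a single long run.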

Relevance: 100.00%

Abstract:

Grand canonical Monte Carlo (GCMC) simulations are applied to the adsorption of sub-critical methanol and ethanol on graphitized carbon black at 300 K. The carbon black was modelled both with and without carbonyl functional groups. Large differences are seen between the amounts adsorbed for different carbonyl configurations at low pressure, prior to monolayer coverage. Once a monolayer has formed on the carbon black, the adsorption behaviour is similar for the model surfaces with and without functional groups. Simulated isotherms for the case of low carbonyl concentration, or no carbonyls, are qualitatively similar to the few experimental isotherms available in the literature for methanol and ethanol adsorption on highly graphitized carbon black. Isosteric heats and adsorbed-phase heat capacities are shown to be very sensitive to the carbonyl configuration. A maximum is observed in the adsorbed-phase heat capacity of the alcohols in all simulations, but it is unrealistically high for a plain graphite surface. Adding carbonyls to the surface greatly reduces this maximum and approaches experimental data at carbonyl concentrations as low as 0.09 carbonyls/nm².
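The GCMC machinery behind such isotherms accepts trial particle insertions and deletions with the standard grand-canonical Metropolis criteria. A minimal sketch in reduced units (the cubed thermal wavelength is folded into `Lambda3`; the demo values are illustrative):

```python
import math

def acc_insert(N, V, mu, dU, beta, Lambda3=1.0):
    """Acceptance probability for a trial insertion in GCMC:
    min(1, V / (Lambda^3 (N+1)) * exp(beta*mu - beta*dU)),
    where dU is the potential-energy change caused by the move."""
    return min(1.0, V / (Lambda3 * (N + 1)) * math.exp(beta * mu - beta * dU))

def acc_delete(N, V, mu, dU, beta, Lambda3=1.0):
    """Acceptance probability for a trial deletion:
    min(1, Lambda^3 N / V * exp(-beta*mu - beta*dU))."""
    return min(1.0, Lambda3 * N / V * math.exp(-beta * mu - beta * dU))

# Illustrative numbers: 20 particles in a box of volume 125 at beta = 1.
print(acc_insert(N=20, V=125.0, mu=-3.0, dU=-1.0, beta=1.0))
print(acc_delete(N=20, V=125.0, mu=-3.0, dU=1.0, beta=1.0))
```

Because the chemical potential (equivalently, pressure) is imposed and the particle number fluctuates, averaging N over the run directly yields one point on the adsorption isotherm.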

Relevance: 100.00%

Abstract:

The adsorption of Lennard-Jones fluids (argon and nitrogen) onto a graphitized thermal carbon black surface was studied with grand canonical Monte Carlo (GCMC) simulation. The surface was assumed to be finite in length and composed of three graphene layers. When the GCMC simulation was used to describe adsorption on a graphite surface, an over-prediction of the isotherm was consistently observed in the pressure regions where the first and second layers are formed. To remove this over-prediction, surface mediation was accounted for by reducing the fluid-fluid interaction. Do and co-workers introduced the so-called surface-mediation damping factor to correct the over-prediction for a graphite surface of infinite extent, and this approach has yielded a good description of the adsorption isotherm. In this paper, the effects of the finite size of the graphene layer on the adsorption isotherm, and how they affect the extent of the surface mediation, were studied. It was found that this finite-surface model provides a better description of the experimental data for graphitized thermal carbon black of high surface area (i.e. small crystallite size), while the infinite-surface model describes data for carbon black of very low surface area (i.e. large crystallite size).
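One operational reading of "surface mediation" is that the fluid-fluid pair energy is scaled down when both molecules sit in the first adsorbed layer. A minimal sketch with an assumed damping factor and layer cutoff; the actual functional form used by Do and co-workers may differ:

```python
def lj(r, eps=1.0, sigma=1.0):
    """12-6 Lennard-Jones pair potential in reduced units."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * sr6 * (sr6 - 1.0)

def mediated_lj(r, z1, z2, damping=0.9, first_layer=1.2, eps=1.0, sigma=1.0):
    """Fluid-fluid LJ interaction, damped when both molecules lie in the
    first adsorbed layer (z below an assumed cutoff) to mimic surface
    mediation; 'damping' and 'first_layer' are illustrative values."""
    scale = damping if (z1 < first_layer and z2 < first_layer) else 1.0
    return scale * lj(r, eps, sigma)
```

Weakening the in-layer attraction lowers the predicted amount adsorbed exactly in the first- and second-layer pressure regions where the unmediated model over-predicts.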

Relevance: 100.00%

Abstract:

Several procedures for calculating the heat of adsorption from Monte Carlo simulations of a heterogeneous adsorbent are presented. Simulations were performed to generate isotherms for nitrogen at 77 K and methane at 273.15 K in graphitic slit pores of various widths. The procedures were then applied to calculate the heat of adsorption for an activated carbon with an arbitrary pore size distribution. The consistency of the different procedures shows them to be correct in calculating the interaction-energy contributions to the heat of adsorption. The procedure currently favored in the literature for this type of calculation is shown to be incorrect, and to be in serious error when calculating the heat of adsorption of activated carbon.
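One standard fluctuation-theory procedure computes the isosteric heat from cross-correlations of configurational energy and particle number collected during a GCMC run. A minimal sketch for a single homogeneous system (not the heterogeneous pore-size-distribution case treated in the paper, where the correct way to combine pores is precisely the issue):

```python
def isosteric_heat(U, N, kT):
    """Isosteric heat from GCMC fluctuations:
    q_st = kT - (<U N> - <U><N>) / (<N^2> - <N>^2)."""
    n = len(U)
    mU = sum(U) / n
    mN = sum(N) / n
    cov_UN = sum(u * m for u, m in zip(U, N)) / n - mU * mN
    var_N = sum(m * m for m in N) / n - mN * mN
    return kT - cov_UN / var_N

# Toy samples of (configurational energy, particle count) from a GCMC run:
U = [-10.0, -12.0, -11.0, -13.0]
N = [5, 6, 5, 6]
print(isosteric_heat(U, N, kT=1.0))
```

For a heterogeneous solid the covariances must be accumulated consistently across the pore-size distribution rather than averaged naively, which is where the procedures compared in the paper diverge.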

Relevance: 100.00%

Abstract:

Aim: To identify an appropriate dosage strategy for patients receiving enoxaparin by continuous intravenous infusion (CII). Methods: Monte Carlo simulations were performed in NONMEM (200 replicates of 1000 patients) to predict steady-state anti-Xa concentrations (Css) for patients receiving a CII of enoxaparin. The covariate distribution model was simulated based on covariate demographics in the CII study population. The impact of patient weight, renal function (creatinine clearance, CrCL) and patient location (intensive care unit, ICU) was evaluated. A population pharmacokinetic model was used as the input-output model (1-compartment first-order output model with mixed residual error structure). Success of a dosing regimen was based on the percentage of Css values within the therapeutic range of 0.5 IU/ml to 1.2 IU/ml. Results: The best dose for patients in the ICU was 4.2 IU/kg/h (success mean 64.8%, 90% prediction interval (PI): 60.1–69.8%) if CrCL < 60 ml/min; for patients with CrCL > 60 ml/min, the best dose was 8.3 IU/kg/h (success mean 65.4%, 90% PI: 58.5–73.2%). Simulations suggest that there was a 50% improvement in the success of the CII if the dose rate for ICU patients with CrCL
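The simulation logic can be illustrated with a much-simplified stand-in for the published model: steady-state concentration as infusion rate over a log-normally varying clearance, scored against a target window. All parameter values and units below are illustrative, not the fitted NONMEM model:

```python
import math
import random

random.seed(42)

def simulate_success(rate, cl_typical=1.0, omega=0.3, lo=0.5, hi=1.2, n=1000):
    """Fraction of simulated patients whose steady-state concentration
    Css = infusion rate / clearance falls inside [lo, hi]. Clearance
    varies log-normally across patients (between-subject variability
    omega); every number here is illustrative."""
    ok = 0
    for _ in range(n):
        cl = cl_typical * math.exp(random.gauss(0.0, omega))
        css = rate / cl
        ok += lo <= css <= hi
    return ok / n

print(simulate_success(0.8))
```

Repeating this over a grid of candidate dose rates, and per covariate subgroup (ICU status, CrCL band), is the essence of picking the regimen with the highest mean success.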

Relevance: 100.00%

Abstract:

An inherent weakness in the management of large-scale projects is the failure to achieve the scheduled completion date. When projects are planned with the objective of on-time completion, the initial planning plays a vital role in the successful achievement of project deadlines. Cost and quality are additional priorities when such projects are executed. This article proposes a methodology for achieving the planned duration of a project through risk analysis, using a Monte Carlo simulation technique. The methodology is demonstrated with a case application: a cross-country petroleum pipeline construction project.
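A bare-bones version of Monte Carlo schedule risk analysis: sample each task duration from a triangular distribution and estimate the probability of finishing by a deadline. The serial task chain and all numbers below are illustrative, not the pipeline case study:

```python
import random

random.seed(7)

def completion_prob(tasks, deadline, n=20_000):
    """Probability that a serial chain of tasks finishes by 'deadline'.
    Each task is (optimistic, most_likely, pessimistic) days, sampled
    from a triangular distribution -- a common choice in schedule
    risk analysis."""
    hits = 0
    for _ in range(n):
        total = sum(random.triangular(a, c, b) for a, b, c in tasks)
        hits += total <= deadline
    return hits / n

pipeline = [(10, 14, 25), (20, 30, 45), (5, 7, 12)]   # illustrative tasks
print(completion_prob(pipeline, deadline=55))
```

Planning to a deadline with, say, an 80% completion probability, rather than to the sum of most-likely durations, is the practical payoff of the risk-analysis view.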

Relevance: 100.00%

Abstract:

In recent work we developed a novel variational inference method for partially observed systems governed by stochastic differential equations. In this paper we compare the Variational Gaussian Process Smoother with an exact solution computed using a Hybrid Monte Carlo approach to path sampling, applied to a stochastic double-well potential model. It is demonstrated that the variational smoother provides a very accurate estimate of the mean path, while the conditional variance is slightly underestimated. We conclude with some remarks on the advantages and disadvantages of the variational smoother. © 2008 Springer Science + Business Media LLC.
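The double-well model referred to here is a diffusion whose drift derives from a two-minimum potential. A minimal Euler-Maruyama sample-path sketch, with illustrative coefficients rather than the paper's parameterization:

```python
import math
import random

random.seed(0)

def double_well_path(x0=0.0, dt=0.01, steps=2000, sigma=0.5):
    """Euler-Maruyama sample path of dX = 4X(1 - X^2) dt + sigma dW,
    a standard double-well drift with stable states near x = +/-1
    (coefficients are illustrative)."""
    x, path = x0, [x0]
    for _ in range(steps):
        drift = 4.0 * x * (1.0 - x * x)
        x += drift * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
        path.append(x)
    return path

p = double_well_path()
print(min(p), max(p))
```

Paths linger near one well and occasionally hop to the other; that bimodality is what makes the smoothing posterior a demanding test for both the variational approximation and the HMC reference.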

Relevance: 100.00%

Abstract:

In this paper we develop a set of novel Markov chain Monte Carlo algorithms for Bayesian smoothing of partially observed non-linear diffusion processes. The sampling algorithms developed herein use a deterministic approximation to the posterior distribution over paths as the proposal distribution for a mixture of an independence and a random walk sampler. The approximating distribution is sampled by simulating an optimized time-dependent linear diffusion process derived from the recently developed variational Gaussian process approximation method. Flexible blocking strategies are introduced to further improve the mixing, and thus the efficiency, of the sampling algorithms. The algorithms are tested on two diffusion processes: one with a double-well potential drift and another with a sine drift. The new algorithms' accuracy and efficiency are compared with state-of-the-art hybrid Monte Carlo based path sampling. It is shown that in practical, finite-sample applications the algorithms are accurate except in the presence of large observation errors and low observation densities, which lead to a multi-modal structure in the posterior distribution over paths. More importantly, the variational-approximation-assisted sampling algorithm outperforms hybrid Monte Carlo in terms of computational efficiency, except when the diffusion process is densely observed with small errors, in which case both algorithms are equally efficient.
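The proposal mechanism described, a mixture of an independence sampler driven by a deterministic approximation and a random walk, can be sketched in one dimension. The Gaussian target and approximating distribution below are toys, not the diffusion-path setting of the paper; mixing the two kernels is valid because each individually satisfies detailed balance:

```python
import math
import random

random.seed(3)

def metropolis_mixture(logpi, approx_sample, approx_logpdf,
                       x0=0.0, n=5000, step=0.5, p_indep=0.5):
    """Metropolis-Hastings with a mixture of an independence proposal
    (drawn from a deterministic approximation to the target) and a
    Gaussian random walk. 1-D toy version."""
    x, lp, out = x0, logpi(x0), []
    for _ in range(n):
        if random.random() < p_indep:
            y = approx_sample()
            lpy = logpi(y)
            # Hastings ratio for an independence proposal
            loga = lpy - lp + approx_logpdf(x) - approx_logpdf(y)
        else:
            y = x + random.gauss(0.0, step)
            lpy = logpi(y)
            loga = lpy - lp        # symmetric proposal
        if math.log(random.random()) < loga:
            x, lp = y, lpy
        out.append(x)
    return out

# Toy target: standard normal; approximation: a slightly wider normal.
logpi = lambda x: -0.5 * x * x
approx_sample = lambda: random.gauss(0.0, 1.2)
approx_logpdf = lambda x: -0.5 * (x / 1.2) ** 2
xs = metropolis_mixture(logpi, approx_sample, approx_logpdf)
print(sum(xs) / len(xs))
```

The better the approximation matches the target, the more often the independence moves are accepted, which is the source of the efficiency gains reported over plain hybrid Monte Carlo.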

Relevance: 100.00%

Abstract:

∗This research, which was funded by a grant from the Natural Sciences and Engineering Research Council of Canada, formed part of G.A.’s Ph.D. thesis [1].

Relevance: 100.00%

Abstract:

A procedure for calculating the critical level and power of the likelihood ratio test, based on a Monte Carlo simulation method, is proposed. General principles of building software for its implementation are given. Some examples of its application are shown.
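The procedure can be illustrated on a simple case, the likelihood ratio test of a normal mean with known variance, where both the critical value and the power are estimated purely by simulation. The test problem, sample sizes, and replication counts below are illustrative:

```python
import random

random.seed(11)

def lr_stat(sample, mu0=0.0):
    """-2 log LR for H0: mu = mu0 vs free mu, normal data with known
    sd = 1: reduces to n * (xbar - mu0)^2."""
    n = len(sample)
    xbar = sum(sample) / n
    return n * (xbar - mu0) ** 2

def mc_critical_value(n, alpha=0.05, reps=5000):
    """Monte Carlo critical value: the (1 - alpha) empirical quantile of
    the statistic simulated under H0."""
    stats = sorted(lr_stat([random.gauss(0.0, 1.0) for _ in range(n)])
                   for _ in range(reps))
    return stats[int((1 - alpha) * reps)]

def mc_power(n, mu1, crit, reps=5000):
    """Monte Carlo power: rejection rate under the alternative mu = mu1."""
    rej = sum(lr_stat([random.gauss(mu1, 1.0) for _ in range(n)]) > crit
              for _ in range(reps))
    return rej / reps

crit = mc_critical_value(n=30)
print(crit, mc_power(30, 0.5, crit))
```

The simulated critical value should land near the asymptotic chi-squared(1) quantile (about 3.84 at alpha = 0.05), but the Monte Carlo route also works at sample sizes where the asymptotic approximation is unreliable, which is the point of the proposed procedure.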