976 results for "Monte Carlo method"


Abstract:

Monte Carlo simulations with realistic interaction potentials have been carried out on isopentane to investigate the glass transition. Intermolecular pair-correlation functions of the glass show distinct differences from those of the liquid, the CH-CH pair-correlation function being uniquely different from the other pair-correlation functions. The coordination number of the glass is higher than that of the liquid, and the packing in the glass seems to be mainly governed by the geometrical constraints of the molecule. Annealing affects the properties of the glass significantly.
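The intermolecular pair-correlation functions compared above can be computed from stored Monte Carlo configurations by histogramming pair distances. A minimal NumPy sketch (the cubic box, particle count, and ideal-gas test configuration are illustrative assumptions, not taken from the paper, where site-site functions such as CH-CH would be computed per atom type):

```python
import numpy as np

def pair_correlation(positions, box, dr=0.1, r_max=None):
    """Radial pair-correlation function g(r) for particles in a cubic
    periodic box, using the minimum-image convention."""
    n = len(positions)
    if r_max is None:
        r_max = box / 2.0
    bins = np.arange(0.0, r_max + dr, dr)
    counts = np.zeros(len(bins) - 1)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box * np.round(d / box)          # minimum image
        r = np.sqrt((d ** 2).sum(axis=1))
        counts += np.histogram(r, bins=bins)[0]
    rho = n / box ** 3
    r_mid = 0.5 * (bins[1:] + bins[:-1])
    shell = 4.0 * np.pi * r_mid ** 2 * dr     # ideal-gas shell volume
    g = counts / (shell * rho * n / 2.0)      # normalise per pair
    return r_mid, g

# Sanity check on an uncorrelated (ideal-gas) configuration: g(r) ~ 1
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 10.0, size=(500, 3))
r, g = pair_correlation(pos, box=10.0)
```

For a glass versus a liquid, the same routine applied to the two sets of configurations would expose the differences in peak positions and coordination numbers described above.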

Abstract:

State and parameter estimation of non-linear dynamical systems, based on incomplete and noisy measurements, is considered using Monte Carlo simulations. Given the measurements, the proposed method obtains the marginalized posterior distribution of an appropriately chosen (ideally small) subset of the state vector using a particle filter. Samples (particles) of the marginalized states are then used to construct a family of conditionally linearized systems of equations, and thus to obtain the posterior distribution of the states using a bank of Kalman filters. Discrete process equations for the marginalized states are derived through truncated Itô-Taylor expansions. Increased analyticity and reduced dispersion of weights computed over a smaller sample space of marginalized states are the key features of the filter that help achieve smaller sample variance of the estimates. Numerical illustrations are provided for state/parameter estimation of a Duffing oscillator and a 3-DOF non-linear oscillator. Performance of the filter in parameter estimation is also assessed using measurements obtained through experiments on simple laboratory models. Despite an added computational cost, the results verify that the proposed filter generally produces estimates with lower sample variance than the standard sequential importance sampling (SIS) filter.
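The baseline against which the marginalized filter is compared is the SIS (bootstrap) particle filter. The following sketch implements that baseline for a toy scalar linear-Gaussian model, not the Duffing or 3-DOF systems of the paper; the model coefficients, noise levels, and particle count are illustrative assumptions:

```python
import numpy as np

def bootstrap_filter(ys, n_part=2000, q=0.1, r=0.5, seed=1):
    """Bootstrap (SIS with resampling) particle filter for the toy model
        x_k = 0.9 * x_{k-1} + w_k,   y_k = x_k + v_k,
    with w ~ N(0, q^2) and v ~ N(0, r^2). Returns posterior-mean estimates."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_part)                 # initial particle cloud
    est = []
    for y in ys:
        x = 0.9 * x + rng.normal(0.0, q, n_part)     # propagate particles
        logw = -0.5 * ((y - x) / r) ** 2             # Gaussian log-likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        est.append(np.sum(w * x))                    # weighted posterior mean
        idx = rng.choice(n_part, n_part, p=w)        # multinomial resampling
        x = x[idx]
    return np.array(est)
```

The marginalized approach of the abstract replaces the blind propagation step with exact Kalman updates for the conditionally linear substates, which is what reduces the weight dispersion.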

Abstract:

Monte Carlo simulations of a binary alloy with impurity concentrations between 20 and 45 at.% have been carried out. The proportion of large clusters relative to that of small clusters increases with the number of MC diffusion steps as well as impurity concentration. Magnetic susceptibility peaks become more prominent and occur at higher temperatures with increasing impurity concentration. The different peaks in the susceptibility and specific heat curves seem to correspond to different sized clusters. A freezing model would explain the observed behaviour with the large clusters freezing first and the small clusters contributing to susceptibility (specific heat) peaks at lower temperatures.
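The susceptibility peaks discussed above are typically extracted from MC magnetization samples via the fluctuation-dissipation estimator. A minimal sketch (with k_B = 1; the synthetic sample parameters in the test are illustrative, not data from the paper):

```python
import numpy as np

def susceptibility(m_samples, n_spins, temperature):
    """Magnetic susceptibility per spin from magnetisation fluctuations:
        chi = N * (<m^2> - <|m|>^2) / T   (k_B = 1),
    where m_samples are per-spin magnetisations from equilibrated MC sweeps."""
    m = np.asarray(m_samples, dtype=float)
    return n_spins * (np.mean(m ** 2) - np.mean(np.abs(m)) ** 2) / temperature
```

Scanning this estimator over temperature for each impurity concentration is what produces the susceptibility-peak curves whose positions shift with cluster size.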

Abstract:

A Monte Carlo study, along with experimental uptake measurements, of 1,2,3-trimethylbenzene, 1,2,4-trimethylbenzene and 1,3,5-trimethylbenzene (TMB) in zeolite beta is reported. The TraPPE potential has been employed for the hydrocarbon interactions, and the harmonic potential of Demontis for modeling the zeolite framework. The structure, energetics and dynamics of TMB in zeolite beta from the Monte Carlo runs reveal interesting information about the effect of confinement on the properties of these isomers. Of the three isomers, 135TMB is supposed to have the largest diameter. It is seen that the TraPPE potential with the Demontis framework potential predicts a restricted motion of 135TMB in the channels of zeolite beta. Experimentally, 135TMB has the highest transport diffusivity, whereas the MD results suggest it has the lowest self-diffusivity. (C) 2009 Elsevier Inc. All rights reserved.
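The self-diffusivity contrasted with the transport diffusivity above is conventionally obtained from simulation trajectories through the Einstein relation. A sketch under illustrative assumptions (unwrapped coordinates, a simple slope fit through the origin; the random-walk test data stand in for real zeolite trajectories):

```python
import numpy as np

def self_diffusivity(traj, dt, dim=3):
    """Self-diffusivity from the Einstein relation D = MSD(t) / (2 * dim * t).
    traj: unwrapped trajectories, shape (frames, particles, dim)."""
    disp = traj - traj[0]
    msd = (disp ** 2).sum(axis=2).mean(axis=1)    # average over particles
    t = np.arange(len(traj)) * dt
    half = len(t) // 2                            # skip the early-time regime
    # least-squares slope through the origin over the latter half
    slope = np.dot(t[half:], msd[half:]) / np.dot(t[half:], t[half:])
    return slope / (2.0 * dim)
```

For an uncorrelated random walk with per-step, per-dimension standard deviation s and time step dt, this should recover D close to s**2 / (2 * dt).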

Abstract:

We use Bayesian model selection techniques to test extensions of the standard flat ΛCDM paradigm. Dark-energy and curvature scenarios, and primordial perturbation models, are considered. To that end, we calculate the Bayesian evidence in favour of each model using Population Monte Carlo (PMC), a new adaptive sampling technique which was recently applied in a cosmological context. The Bayesian evidence is immediately available from the PMC sample used for parameter estimation, without further computational effort, and it comes with an associated error evaluation. Moreover, it provides an unbiased estimator of the evidence after any fixed number of iterations and is naturally parallelizable, in contrast with MCMC and nested-sampling methods. By comparison with analytical predictions for simulated data, we show that our results obtained with PMC are reliable and robust. The variability of the evidence evaluation and its stability for various cases are estimated both from simulations and from data. For the cases we consider, the log-evidence is calculated with a precision of better than 0.08. Using a combined set of recent CMB, SNIa and BAO data, we find inconclusive evidence between flat ΛCDM and simple dark-energy models. A curved Universe is moderately to strongly disfavoured with respect to a flat cosmology. Using physically well-motivated priors within the slow-roll approximation of inflation, we find a weak preference for a running spectral index. A Harrison-Zel'dovich spectrum is weakly disfavoured. With the current data, tensor modes are not detected; the large prior volume on the tensor-to-scalar ratio r results in moderate evidence in favour of r = 0.
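At its core, the PMC evidence estimate described above is an importance-sampling average of likelihood times prior over proposal; PMC's contribution is adapting the proposal across iterations, which is not shown here. A 1-D Gaussian toy sketch with an analytically known evidence (the prior, likelihood, and proposal choices are illustrative assumptions):

```python
import numpy as np

def log_norm(x, mu, var):
    """Log-density of a 1-D normal distribution."""
    return -0.5 * np.log(2.0 * np.pi * var) - (x - mu) ** 2 / (2.0 * var)

def evidence_is(y, n=200_000, seed=3):
    """Importance-sampling evidence estimate Z ~ mean of L(x)*pi(x)/q(x)
    for prior pi = N(0, 1), likelihood L = N(y | x, 1), and a fixed
    proposal q = N(y/2, 1) (in PMC, q would be adapted iteratively)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(y / 2.0, 1.0, n)                              # draw from q
    logw = log_norm(y, x, 1.0) + log_norm(x, 0.0, 1.0) - log_norm(x, y / 2.0, 1.0)
    return np.exp(logw).mean()
```

For this conjugate model the exact evidence is N(y | 0, 2), so the estimator can be checked directly; the scatter of the weights also yields the associated error estimate mentioned in the abstract.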

Abstract:

Genetics, the science of heredity and variation in living organisms, has a central role in medicine, in breeding crops and livestock, and in studying fundamental topics of the biological sciences such as evolution and cell functioning. The field of genetics is currently developing rapidly because of recent advances in technologies by which molecular data can be obtained from living organisms. To extract as much information as possible from such data, the analyses need to be carried out using statistical models that are tailored to the particular genetic processes. In this thesis we formulate and analyze Bayesian models for genetic marker data of contemporary individuals. The major focus is on modeling the unobserved recent ancestry of the sampled individuals (say, for tens of generations or so), which is carried out by using explicit probabilistic reconstructions of the pedigree structures accompanied by the gene flows at the marker loci. For such a recent history, the recombination process is the major genetic force that shapes the genomes of the individuals, and it is included in the model by assuming that the recombination fractions between adjacent markers are known. The posterior distribution of the unobserved history of the individuals is studied conditionally on the observed marker data by using a Markov chain Monte Carlo (MCMC) algorithm. The example analyses consider estimation of the population structure, the relatedness structure (both at the level of whole genomes and at each marker separately), and haplotype configurations. For situations where the pedigree structure is partially known, an algorithm to create an initial state for the MCMC algorithm is given. Furthermore, the thesis includes an extension of the model for the recent genetic history to situations where a quantitative phenotype has also been measured from the contemporary individuals. In that case the goal is to identify positions on the genome that affect the observed phenotypic values. This task is carried out within the Bayesian framework, where the number and the relative effects of the quantitative trait loci are treated as random variables whose posterior distribution is studied conditionally on the observed genetic and phenotypic data. In addition, the thesis contains an extension of a widely used haplotyping method, the PHASE algorithm, to settings where genetic material from several individuals has been pooled together and the allele frequencies of each pool are determined in a single genotyping.
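The MCMC exploration of posteriors described above rests on the Metropolis accept/reject step. A generic random-walk Metropolis sketch on a toy 1-D target (not the pedigree model of the thesis; the target, step size, and chain length are illustrative assumptions):

```python
import numpy as np

def metropolis(log_post, x0, n_steps=50_000, step=0.5, seed=4):
    """Random-walk Metropolis sampler: propose x' = x + N(0, step^2) and
    accept with probability min(1, exp(log_post(x') - log_post(x)))."""
    rng = np.random.default_rng(seed)
    x = x0
    lp = log_post(x)
    chain = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + rng.normal(0.0, step)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
    return chain
```

In the thesis setting the state is a full pedigree plus gene flows and the proposals modify those structures, but the acceptance rule is the same; the initial-state algorithm mentioned above plays the role of x0.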

Abstract:

A Monte Carlo simulation of Ising chains with competing short-range and infinite-range interactions has been carried out. Results show that, whenever the system does not enter a metastable state, variation of temperature brings about phase transitions in the Ising chain. These phase transitions, except for two sets of interaction strengths, are generally of higher order and involve changes in the long-range order while the short-range order remains unaffected.
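A model of this type can be sketched with a Metropolis sweep over a chain whose Hamiltonian combines a nearest-neighbour coupling J with a mean-field (infinite-range) coupling K/n. The Hamiltonian form, couplings, and system size below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def ising_chain_mc(n=200, J=1.0, K=1.0, T=0.5, sweeps=300, start='random', seed=5):
    """Metropolis simulation of an Ising chain with
        H = -J * sum_i s_i s_{i+1}  -  (K / 2n) * (sum_i s_i)^2
    (periodic boundary, k_B = 1). Returns the final magnetisation per spin."""
    rng = np.random.default_rng(seed)
    s = np.ones(n, dtype=int) if start == 'up' else rng.choice([-1, 1], n)
    M = s.sum()
    for _ in range(sweeps):
        for _ in range(n):
            i = rng.integers(n)
            nn = s[(i - 1) % n] + s[(i + 1) % n]
            # energy cost of flipping s[i]: short-range plus mean-field parts
            dE = 2.0 * s[i] * (J * nn + (K / n) * (M - s[i]))
            if dE <= 0.0 or rng.uniform() < np.exp(-dE / T):
                M -= 2 * s[i]
                s[i] = -s[i]
    return M / n
```

Tracking the magnetisation (long-range order) and the nearest-neighbour correlation (short-range order) across a temperature scan is how the transitions described above would show up; runs started from an ordered state also expose the metastability the abstract mentions.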

Abstract:

A better understanding of the limiting step in a first-order phase transition, the nucleation process, is of major importance to a variety of scientific fields ranging from atmospheric sciences to nanotechnology and even cosmology. This is because in most phase transitions the new phase is separated from the mother phase by a free-energy barrier, which is crossed in a process called nucleation. Nowadays it is considered that a significant fraction of all atmospheric particles is produced by vapor-to-liquid nucleation. In atmospheric sciences, as in other scientific fields, the theoretical treatment of nucleation is mostly based on a theory known as Classical Nucleation Theory. However, Classical Nucleation Theory is known to have only limited success in predicting the rate at which vapor-to-liquid nucleation takes place at given conditions. This thesis studies unary homogeneous vapor-to-liquid nucleation from a statistical-mechanics viewpoint. We apply Monte Carlo simulations of molecular clusters to calculate the free-energy barrier separating the vapor and liquid phases, and compare our results against laboratory measurements and Classical Nucleation Theory predictions. According to our results, the work of adding a monomer to a cluster in equilibrium vapour is accurately described by the liquid-drop model applied by Classical Nucleation Theory once the clusters are larger than some threshold size. The threshold cluster sizes contain only a few to some tens of molecules, depending on the interaction potential and temperature. However, the error made in modeling the smallest clusters as liquid drops results in an erroneous absolute value for the cluster work of formation throughout the size range, as predicted by the McGraw-Laaksonen scaling law.
By calculating correction factors to the Classical Nucleation Theory predictions for the nucleation barriers of argon and water, we show that the corrected predictions produce nucleation rates that are in good agreement with experiments. For the smallest clusters, the deviation between the simulation results and the liquid-drop values is accurately modelled by the low-order virial coefficients at modest temperatures and vapour densities, or in other words, in the validity range of the non-interacting cluster theory of Frenkel, Band and Bijl. Our results do not indicate a need for a size-dependent replacement free-energy correction, and they indicate that Classical Nucleation Theory predicts the size of the critical cluster correctly. We also present a new method for calculating the equilibrium vapour density, the size dependence of the surface tension, and the planar surface tension directly from cluster simulations. Finally, we show that the size dependence of the cluster surface tension at the equimolar surface is a function of the virial coefficients, a result confirmed by our cluster simulations.
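The Classical Nucleation Theory barrier that the thesis corrects has a standard closed form in terms of the planar surface tension, the molecular volume of the liquid, and the supersaturation. A sketch of that textbook formula (the water-like numbers in the test are illustrative order-of-magnitude inputs, not values from the thesis):

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def cnt_barrier(sigma, v_mol, T, S):
    """Classical Nucleation Theory barrier height (in joules):
        dG* = 16 * pi * sigma^3 * v^2 / (3 * (k_B * T * ln S)^2)
    sigma: planar surface tension (N/m), v_mol: molecular volume of the
    liquid (m^3), T: temperature (K), S: supersaturation ratio."""
    lnS = np.log(S)
    return 16.0 * np.pi * sigma ** 3 * v_mol ** 2 / (3.0 * (K_B * T * lnS) ** 2)
```

Since the nucleation rate scales as exp(-dG*/kT), the correction factors computed in the thesis for this barrier translate directly into the corrected rates compared against experiment.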


Abstract:

The dynamics of low-density flows is governed by the Boltzmann equation of the kinetic theory of gases. This is a non-linear integro-differential equation and, in general, numerical methods must be used to obtain its solution. The present paper, after a brief review of the Direct Simulation Monte Carlo (DSMC) methods due to Bird and to Belotserkovskii and Yanitskii, studies the details of the DSMC method of Deshpande for mono- as well as multi-component gases. The present method is a statistical particle-in-cell method and is based upon the Kac-Prigogine master equation, which reduces to the Boltzmann equation under the hypothesis of molecular chaos. The proposed Markov model simulating the collisions uses a Poisson distribution for the number of collisions allowed in the cells into which the physical space is divided. The model is then extended to a binary mixture of gases, and it is shown that the collisions must be performed in a certain sequence to obtain an unbiased simulation.
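The Poisson-distributed collision count per cell can be sketched as a single collision step. The version below is a simplified illustration, not the paper's scheme: pairs are chosen uniformly rather than with relative-speed weighting, masses are equal, and scattering is isotropic (hard-sphere); momentum and kinetic energy are conserved per collision:

```python
import numpy as np

def dsmc_collision_step(v, cell_of, n_cells, nu_dt, seed=6):
    """One DSMC collision step: in each spatial cell, draw the number of
    binary collisions from a Poisson distribution with mean nu_dt, then
    scatter each chosen pair isotropically about its centre of mass
    (equal masses), preserving momentum and kinetic energy."""
    rng = np.random.default_rng(seed)
    for c in range(n_cells):
        members = np.flatnonzero(cell_of == c)
        if len(members) < 2:
            continue
        for _ in range(rng.poisson(nu_dt)):
            i, j = rng.choice(members, 2, replace=False)
            g = np.linalg.norm(v[i] - v[j])        # relative speed (conserved)
            vcm = 0.5 * (v[i] + v[j])              # centre-of-mass velocity
            # isotropic post-collision relative velocity of magnitude g
            cos_t = rng.uniform(-1.0, 1.0)
            sin_t = np.sqrt(1.0 - cos_t ** 2)
            phi = rng.uniform(0.0, 2.0 * np.pi)
            g_new = g * np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
            v[i] = vcm + 0.5 * g_new
            v[j] = vcm - 0.5 * g_new
    return v
```

For a binary mixture, the abstract's point is that the like-like and unlike collisions cannot be performed in an arbitrary order without biasing the simulation, which this single-species sketch does not capture.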

Abstract:

Based on isothermal-isobaric simulations, the structure and properties of the plastic crystalline phases of C60 and neopentane have been examined. Instantaneous cooling of the plastic crystalline phases of both C60 and neopentane leads to orientational glassy phases, accompanied by a significant slowing down of reorientational motion. Constant-pressure quench experiments on C60 yield a glass transition temperature of around 80 K.
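Orientational freezing of this kind is commonly quantified with a second-rank orientational order parameter over the molecular axes. A minimal sketch (the director and synthetic axis sets are illustrative assumptions; the paper's own diagnostics may differ):

```python
import numpy as np

def orientational_order(axes, director):
    """Second-rank orientational order parameter
        P2 = < (3 * cos^2(theta) - 1) / 2 >
    of molecular axes relative to a reference director: close to 1 for a
    frozen (aligned) phase, close to 0 for a freely rotating plastic phase."""
    u = axes / np.linalg.norm(axes, axis=1, keepdims=True)
    d = np.asarray(director, dtype=float)
    d /= np.linalg.norm(d)
    cos2 = (u @ d) ** 2
    return float(np.mean(1.5 * cos2 - 0.5))
```

Monitoring such an order parameter (or a reorientational correlation time) across a quench is how the slowing down and the roughly 80 K glass transition would be located.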