66 results for Monte Carlo Simulations



Abstract:

Systems of two-dimensional hard ellipses of varying aspect ratios and packing fractions are studied by Monte Carlo simulations in the generalised canonical ensemble. From this microscopic model, we extract a coarse-grained macroscopic Landau-de Gennes free energy as a function of packing fraction and orientational order parameter. We separate the free energy into the ideal orientational entropy of non-interacting two-dimensional spins and an excess free energy associated with excluded-volume interactions. We further explore the isotropic-nematic phase transition using our empirical expression for the free energy and find that the transition is continuous for the aspect ratios studied.
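
A minimal way to write the decomposition described above, with packing fraction \phi and two-dimensional orientational order parameter S; the integral form of the ideal term is a standard assumption for non-interacting planar spins, and the excess term is the quantity extracted from the simulations:

    F(\phi, S) = F_{\mathrm{id}}(S) + F_{\mathrm{ex}}(\phi, S), \qquad
    \frac{F_{\mathrm{id}}(S)}{N k_B T} = \int_0^{2\pi} f(\theta)\,\ln\!\bigl[2\pi f(\theta)\bigr]\,\mathrm{d}\theta, \qquad
    S = \langle \cos 2\theta \rangle,

where F_id is minus the temperature times the ideal orientational entropy of the non-interacting two-dimensional spins, f(\theta) is the single-particle orientational distribution, and F_ex(\phi, S) is the excluded-volume excess term fitted from the simulation data.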


Abstract:

Monte Carlo algorithms often aim to draw from a distribution π by simulating a Markov chain with transition kernel P such that π is invariant under P. However, there are many situations in which it is impractical or impossible to draw from the transition kernel P. This is the case, for instance, with massive datasets, where it is prohibitively expensive to calculate the likelihood, and with intractable likelihood models arising from, for example, Gibbs random fields, such as those found in spatial statistics and network analysis. A natural approach in these cases is to replace P by an approximation P̂. Using theory from the stability of Markov chains, we explore a variety of situations in which it is possible to quantify how 'close' the chain given by the transition kernel P̂ is to the chain given by P. We apply these results to several examples from spatial statistics and network analysis.
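
One concrete way such an approximate kernel P̂ can arise, sketched below under stated assumptions: a random-walk Metropolis sampler whose acceptance ratio uses a rescaled log-likelihood computed on a random subsample of a massive dataset rather than the full sum. The function names, the scalar parameter and the subsampling rule are illustrative, not the paper's construction.

    import numpy as np

    def approx_mh(log_prior, log_lik_term, data, n_iter, step, subsample, rng=None):
        """Random-walk Metropolis whose kernel approximates P by subsampling the likelihood.

        log_lik_term(theta, x) is the log-likelihood of one observation x; only
        `subsample` observations, redrawn every iteration and rescaled to mimic the
        full sum, enter the acceptance ratio, so the resulting chain is driven by an
        approximate kernel P-hat rather than the exact kernel P."""
        rng = np.random.default_rng() if rng is None else rng
        theta, chain = 0.0, np.empty(n_iter)
        scale = len(data) / subsample
        for i in range(n_iter):
            prop = theta + step * rng.standard_normal()
            idx = rng.choice(len(data), size=subsample, replace=False)
            log_ratio = log_prior(prop) - log_prior(theta) + scale * sum(
                log_lik_term(prop, data[j]) - log_lik_term(theta, data[j]) for j in idx)
            if np.log(rng.random()) < log_ratio:
                theta = prop
            chain[i] = theta
        return chain

For a Gaussian location model, for example, log_lik_term could return -0.5 * (x - theta)**2 up to an additive constant, and the effect of the approximation can be probed by varying `subsample`.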


Abstract:

A methodology for discovering the mechanisms and dynamics of protein clustering on solid surfaces is presented. In situ atomic force microscopy images are quantitatively compared to Monte Carlo simulations using cluster statistics to differentiate between various models. We study lysozyme adsorption on mica as a model system and find that all surface-supported clusters are mobile, not just the monomers, with a diffusion constant inversely related to cluster size. The surface monomer diffusion constant is measured to be D1 ~ 9×10^-16 cm^2 s^-1, a value so low that it would be difficult to measure using other techniques.


Abstract:

Despite its relevance to a wide range of technological and fundamental areas, a quantitative understanding of protein surface-clustering dynamics is often lacking. In inorganic crystal growth, surface clustering of adatoms is well described by diffusion-aggregation models, in which the statistical properties of the aggregate arrays often reveal the molecular-scale aggregation processes. We investigate the potential of these theories to reveal hitherto hidden facets of protein clustering by carrying out concomitant observations of lysozyme adsorption onto mica surfaces, using atomic force microscopy, and Monte Carlo simulations of cluster nucleation and growth. We find that lysozyme clusters diffuse across the substrate at a rate that varies inversely with size. This result points to the molecular-scale mechanisms responsible for the mobility of the proteins on the substrate. In addition, the surface diffusion coefficient of the monomer can be extracted from the comparison between experiments and simulations. While concentrating on a model system of lysozyme-on-mica, this 'proof of concept' study successfully demonstrates the potential of our approach to understand and influence more biomedically applicable protein-substrate couples.
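
A minimal sketch, under assumed parameters, of the kind of nucleation-and-growth Monte Carlo such a comparison relies on: monomers adsorb onto a lattice, clusters attempt hops with a probability inversely proportional to their size, and clusters that meet coalesce. The point-cluster simplification, the deposition rate and the 1/n mobility rule are illustrative assumptions, not the exact simulation of the study.

    import numpy as np

    def cluster_growth(n_steps, lattice=256, deposition_rate=0.01, d1=1.0, rng=None):
        """Point-cluster diffusion-aggregation sketch with size-dependent mobility.

        Clusters are stored as {site: number_of_monomers}; a cluster of n monomers
        hops to a neighbouring site with probability d1 / n, and two clusters that
        land on the same site merge.  The resulting cluster-size statistics are what
        would be compared against the AFM images."""
        rng = np.random.default_rng() if rng is None else rng
        moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
        clusters = {}
        for _ in range(n_steps):
            if rng.random() < deposition_rate:                       # monomer adsorption
                site = (int(rng.integers(lattice)), int(rng.integers(lattice)))
                clusters[site] = clusters.get(site, 0) + 1
            if clusters:
                site = list(clusters)[rng.integers(len(clusters))]   # pick a cluster at random
                if rng.random() < d1 / clusters[site]:               # mobility falls as 1/size
                    dx, dy = moves[rng.integers(4)]
                    new = ((site[0] + dx) % lattice, (site[1] + dy) % lattice)
                    clusters[new] = clusters.get(new, 0) + clusters.pop(site)
        return clusters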


Abstract:

A reference model of fallible endgame play is defined in terms of a spectrum of endgame players whose play ranges in competence from the optimal to the anti-optimal choice of move. They may be used as suitably skilled practice partners, to assess a player, to differentiate between otherwise equi-optimal moves, to promote or expedite a game result, to run Monte Carlo simulations, and to identify the difficulty of a position or a whole endgame.
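
A minimal sketch of one way a spectrum of players between anti-optimal and optimal could be parameterised: choose among the legal moves with weights that depend exponentially on the move values through a single competence parameter. The softmax rule and the parameter c are illustrative assumptions rather than the reference model's actual formulation.

    import numpy as np

    def fallible_move(move_values, c, rng=None):
        """Pick the index of a move given its value to the player (higher is better).

        Large positive c approaches the optimal player, large negative c the
        anti-optimal player, and c = 0 chooses uniformly at random, giving a
        one-parameter spectrum of fallible players."""
        rng = np.random.default_rng() if rng is None else rng
        u = c * np.asarray(move_values, dtype=float)
        w = np.exp(u - u.max())                 # stabilised softmax weights
        return int(rng.choice(len(u), p=w / w.sum()))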


Abstract:

A reference model of fallible endgame play is defined in terms of a spectrum of endgame players whose play ranges in competence from random to optimal choice of move. They may be used as suitable practice partners, to differentiate between otherwise equi-optimal moves, to promote or expedite a result, to assess an opponent, to run Monte Carlo simulations, and to identify the difficulty of a position or a whole endgame.


Abstract:

We have estimated the speed and direction of propagation of a number of Coronal Mass Ejections (CMEs) using single-spacecraft data from the STEREO Heliospheric Imager (HI) wide-field cameras. In general, these values are in good agreement with those predicted by Thernisien, Vourlidas, and Howard in Solar Phys. 256, 111 – 130 (2009) using a forward-modelling method to fit CMEs imaged by the STEREO COR2 coronagraphs. The directions of the CMEs predicted by both techniques are in good agreement, despite the fact that many of the CMEs under study travel in directions that cause them to fade rapidly in the HI images. The velocities estimated by the two techniques are also in general agreement, although there are some interesting differences that may provide evidence for the influence of the ambient solar wind on the speed of CMEs. The majority of CMEs with a velocity estimated to be below 400 km s^-1 in the COR2 field of view have higher estimated velocities in the HI field of view, while, conversely, those with COR2 velocities estimated to be above 400 km s^-1 have lower estimated HI velocities. We interpret this as evidence for the deceleration of fast CMEs and the acceleration of slower CMEs through interaction with the ambient solar wind beyond the COR2 field of view. We also show that the uncertainties in our derived parameters are influenced by the range of elongations over which each CME can be tracked. In order to reduce the uncertainty in the predicted arrival time of a CME at 1 Astronomical Unit (AU) to within six hours, the CME needs to be tracked out to at least 30 degrees elongation. This is in good agreement with predictions of the accuracy of our technique based on Monte Carlo simulations.


Abstract:

This paper introduces a method for simulating multivariate samples that have exact means, covariances, skewness and kurtosis. We introduce a new class of rectangular orthogonal matrix which is fundamental to the methodology; we call these matrices L matrices. They may be deterministic, parametric or data-specific in nature. The target moments determine the L matrix, and infinitely many random samples with the same exact moments may then be generated by multiplying the L matrix by arbitrary random orthogonal matrices. This methodology is thus termed “ROM simulation”. Considering certain elementary types of random orthogonal matrices, we demonstrate that they generate samples with different characteristics. ROM simulation has applications to many problems that are resolved using standard Monte Carlo methods, but no parametric assumptions are required (unless parametric L matrices are used), so there is no sampling error caused by the discrete approximation of a continuous distribution, which is a major source of error in standard Monte Carlo simulations. For illustration, we apply ROM simulation to determine the value-at-risk of a stock portfolio.
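
A minimal sketch of the multiplication step described above, under the assumption that an L matrix has already been constructed to carry the target moments (that construction is the substance of the method and is not reproduced here); only the generation of random orthogonal matrices and the repeated multiplication are shown.

    import numpy as np

    def random_orthogonal(m, rng):
        """Draw an m x m random orthogonal matrix via QR of a Gaussian matrix."""
        q, r = np.linalg.qr(rng.standard_normal((m, m)))
        return q * np.sign(np.diag(r))          # sign correction for a uniform draw

    def rom_samples(l_matrix, n_draws, rng=None):
        """Generate samples l_matrix @ R for independent random orthogonal R.

        l_matrix is an n x k matrix assumed to have been built so that the products
        carry the target mean, covariance, skewness and kurtosis; each draw below is
        then a distinct sample sharing those moments, which is the 'ROM simulation'
        idea sketched here."""
        rng = np.random.default_rng() if rng is None else rng
        k = l_matrix.shape[1]
        return [l_matrix @ random_orthogonal(k, rng) for _ in range(n_draws)]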


Abstract:

The effect of polydispersity on an AB diblock copolymer melt is investigated using lattice-based Monte Carlo simulations. We consider melts of symmetric composition, in which the B blocks are monodisperse and the A blocks are polydisperse with a Schultz-Zimm distribution. In agreement with experiment and self-consistent field theory (SCFT), we find that polydispersity causes a significant increase in domain size. It also induces a transition from flat to curved interfaces, with the polydisperse blocks residing on the inside of the interfacial curvature. Most importantly, the simulations show a relatively small shift in the order-disorder transition (ODT), in agreement with experiment, whereas SCFT incorrectly predicts a sizable shift towards higher temperatures.
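
A minimal sketch of how polydisperse A-block lengths following a Schultz-Zimm distribution could be drawn for such a lattice model, using the standard identification of the Schultz-Zimm distribution with a gamma distribution of shape k = 1/(PDI - 1); the rounding to integer segment counts is an illustrative assumption.

    import numpy as np

    def schultz_zimm_lengths(n_chains, mean_length, pdi, rng=None):
        """Draw integer block lengths from a Schultz-Zimm (gamma) distribution.

        `mean_length` is the number-average block length and `pdi` the
        polydispersity index Mw/Mn > 1; the gamma shape parameter is k = 1/(pdi - 1)
        and the scale is chosen so the mean equals `mean_length`."""
        rng = np.random.default_rng() if rng is None else rng
        k = 1.0 / (pdi - 1.0)
        lengths = rng.gamma(shape=k, scale=mean_length / k, size=n_chains)
        return np.maximum(1, np.rint(lengths)).astype(int)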


Abstract:

Identifying a periodic time-series model from environmental records without imposing the positivity of the growth rate does not necessarily respect the time order of the data observations. Consequently, subsequent observations sampled in the environmental archive can be inverted on the time axis, resulting in a non-physical signal model. In this paper an optimization technique with linear constraints on the signal model parameters is proposed that prevents such time inversions. The activation conditions for this constrained optimization are based upon the physical constraint on the growth rate, namely that it cannot be negative. The actual constraints are defined for polynomials and first-order splines as basis functions for the nonlinear contribution to the distance-time relationship. The method is compared with an existing method that eliminates the time inversions, and its noise sensitivity is tested by means of Monte Carlo simulations. Finally, the usefulness of the method is demonstrated on measurements of vessel density in a mangrove tree, Rhizophora mucronata, and of Mg/Ca ratios in a bivalve, Mytilus trossulus.
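
A minimal sketch of the first-order-spline case under a non-negative growth-rate constraint: with a piecewise-linear basis, "growth rate >= 0" becomes a simple bound on the per-segment slope parameters, so the fit reduces to a bound-constrained linear least-squares problem. The knot placement, the parameterisation and the use of scipy's lsq_linear are illustrative assumptions, not the paper's exact formulation.

    import numpy as np
    from scipy.optimize import lsq_linear

    def monotone_spline_fit(t, d, knots):
        """Fit a distance-time record d(t) with a first-order spline whose growth rate
        (the slope on every segment) is constrained to be non-negative, which rules
        out time inversions in the fitted model."""
        t = np.asarray(t, dtype=float)
        # design matrix: an intercept plus one truncated-ramp column per segment,
        # so the coefficient of each ramp is exactly the slope on that segment
        columns = [np.ones_like(t)]
        for lo, hi in zip(knots[:-1], knots[1:]):
            columns.append(np.clip(t - lo, 0.0, hi - lo))
        design = np.column_stack(columns)
        lower = np.r_[-np.inf, np.zeros(len(knots) - 1)]   # slopes >= 0, intercept free
        fit = lsq_linear(design, np.asarray(d, dtype=float), bounds=(lower, np.inf))
        return fit.x, design @ fit.x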


Abstract:

We consider the finite-sample properties of model selection by information criteria in conditionally heteroscedastic models. Recent theoretical results show that certain popular criteria are consistent, in that they select the true model asymptotically with probability 1. To examine the empirical relevance of this property, Monte Carlo simulations are conducted for a set of non-nested data generating processes (DGPs), with the set of candidate models consisting of all the model types used as DGPs. In addition, not only is the best model considered but also those with similar values of the information criterion, called close competitors, thus forming a portfolio of eligible models. To supplement the simulations, the criteria are applied to a set of economic and financial series. In the simulations, the criteria are largely ineffective at identifying the correct model, either as best or as a close competitor, the parsimonious GARCH(1,1) model being preferred for most DGPs. In contrast, asymmetric models are generally selected to represent actual data. This leads to the conjecture that the parameterizations of processes commonly used to model heteroscedastic data behave more similarly than might be imagined, and that more attention needs to be paid to the behaviour of the standardized disturbances of such models, both in simulation exercises and in empirical modelling.
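
A minimal sketch of the selection rule the study examines: rank the candidate models by an information criterion and retain, besides the best model, every model whose criterion value lies within a chosen margin of the minimum, forming the portfolio of close competitors. The dictionary interface and the margin are illustrative assumptions.

    def close_competitors(ic_values, margin):
        """Return the best model and the portfolio of close competitors.

        ic_values maps model names to information-criterion values (smaller is
        better); any model within `margin` of the minimum stays in the portfolio."""
        best = min(ic_values, key=ic_values.get)
        portfolio = [name for name, value in ic_values.items()
                     if value - ic_values[best] <= margin]
        return best, portfolio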


Abstract:

Given a nonlinear model, a probabilistic forecast may be obtained by Monte Carlo simulations. At a given forecast horizon, Monte Carlo simulations yield sets of discrete forecasts, which can be converted to density forecasts. The resulting density forecasts will inevitably be degraded by model mis-specification. In order to enhance the quality of the density forecasts, one can mix them with the unconditional density. This paper examines the value of combining conditional density forecasts with the unconditional density. The findings have positive implications for issuing early warnings in different disciplines, including economics and meteorology; UK inflation forecasts are considered as an example.
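
A minimal sketch of the combination step: turn the Monte Carlo ensemble at one horizon into a density forecast (here by a Gaussian kernel estimate, which is an assumption) and mix it with the unconditional density using a weight w. The weight and the use of scipy's gaussian_kde are illustrative choices.

    import numpy as np
    from scipy.stats import gaussian_kde

    def combined_density(ensemble, unconditional_pdf, w):
        """Mix a conditional density forecast with the unconditional density.

        `ensemble` holds the discrete Monte Carlo forecasts at one horizon,
        `unconditional_pdf` is a callable density, and 0 <= w <= 1 is the weight
        placed on the conditional forecast."""
        conditional = gaussian_kde(ensemble)
        def density(y):
            y = np.atleast_1d(np.asarray(y, dtype=float))
            return w * conditional(y) + (1.0 - w) * unconditional_pdf(y)
        return density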


Abstract:

The aim of this study was, within a sensitivity analysis framework, to determine whether additional model complexity gives a better capability to model the hydrology and nitrogen dynamics of a small Mediterranean forested catchment, or whether the additional parameters cause over-fitting. Three nitrogen models of varying hydrological complexity were considered. For each model, general sensitivity analysis (GSA) and Generalized Likelihood Uncertainty Estimation (GLUE) were applied, each based on 100,000 Monte Carlo simulations. The results highlighted the most complex structure as the most appropriate, providing the best representation of the non-linear patterns observed in the flow and streamwater nitrate concentrations between 1999 and 2002. Its 5% and 95% GLUE bounds, obtained using a multi-objective approach, provide the narrowest band for streamwater nitrogen, which suggests increased model robustness, although all models exhibit periods of both good and poor fit between simulated outcomes and observed data. The results confirm the importance of the riparian zone in controlling the short-term (daily) streamwater nitrogen dynamics in this catchment, but not the overall flux of nitrogen from the catchment. It was also shown that as the complexity of a hydrological model increases, over-parameterisation occurs, whereas the converse is true for a water quality model, where additional process representation leads to additional acceptable model simulations. Water quality data help constrain the hydrological representation in process-based models, and the increased complexity was therefore justifiable for modelling river-system hydrochemistry.
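
A minimal outline of the GLUE procedure applied to each model, under assumed function names and a scalar behavioural threshold; the study itself used 100,000 runs per model and a multi-objective likelihood measure, which this sketch does not reproduce.

    import numpy as np

    def glue_bounds(run_model, sample_params, likelihood, n_runs, threshold, rng=None):
        """Generalized Likelihood Uncertainty Estimation, in outline.

        run_model(theta) returns a simulated time series; likelihood(sim) returns a
        goodness-of-fit measure.  Runs scoring above `threshold` are kept as
        'behavioural', and the 5% and 95% prediction bounds at each time step are
        the likelihood-weighted quantiles of the behavioural simulations."""
        rng = np.random.default_rng() if rng is None else rng
        sims, weights = [], []
        for _ in range(n_runs):
            sim = run_model(sample_params(rng))
            score = likelihood(sim)
            if score > threshold:
                sims.append(sim)
                weights.append(score)
        sims = np.asarray(sims, dtype=float)
        w = np.asarray(weights, dtype=float)
        w /= w.sum()
        lower, upper = [], []
        for t in range(sims.shape[1]):
            order = np.argsort(sims[:, t])
            cdf = np.cumsum(w[order])
            lower.append(sims[order, t][np.searchsorted(cdf, 0.05)])
            upper.append(sims[order, t][np.searchsorted(cdf, 0.95)])
        return np.asarray(lower), np.asarray(upper)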


Abstract:

The formation of complexes in solutions of oppositely charged polyions has been studied by Monte Carlo simulations. The amount, as well as the length and thus the absolute charge, of one of the polyions has been varied. There is an increasing tendency to form large clusters as the excess of one kind of polyion decreases. When all polyions have the same length, this tendency reaches a maximum near, but off, equivalent amounts of the two types of polyions. When one kind of polyion is made shorter, the propensity to form large clusters decreases and the fluctuations in cluster charge increase. Simple free-energy expressions have been formulated on the basis of a set of simple rules that help rationalize the observations. By calculating cluster distributions in both the grand canonical and canonical ensembles, it has been possible to show the extent of finite-size effects in the simulations.


Abstract:

The formation of complexes in solutions containing positively charged polyions (polycations) and a variable amount of negatively charged polyions (polyanions) has been investigated by Monte Carlo simulations. The polyions were described as flexible chains of charged hard spheres interacting through a screened Coulomb potential. The systems were analyzed in terms of cluster compositions, structure factors, and radial distribution functions. At 50% charge equivalence or less, complexes involving two polycations and one polyanion were frequent, while closer to charge equivalence, larger clusters were formed. Small and neutral complexes dominated the solution at charge equivalence in a monodisperse system, while larger clusters again dominated the solution when the polyions were made polydisperse. The cluster composition and solution structure were also examined as functions of added salt by varying the electrostatic screening length. The observed formation of clusters could be rationalized by a few simple rules.
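
For reference, a minimal statement of the pair interaction named above between beads i and j, written in reduced form with bead charges z_i, hard-sphere diameter \sigma, Bjerrum length l_B and inverse Debye screening length \kappa (the added salt enters through \kappa); the exact reduced units used in the study are not reproduced here.

    u_{ij}(r) =
    \begin{cases}
      \infty, & r < \sigma, \\[4pt]
      z_i z_j \, l_B \, k_B T \, \dfrac{e^{-\kappa r}}{r}, & r \ge \sigma .
    \end{cases}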