75 results for Monte-carlo Simulations


Relevance: 100.00%

Abstract:

We have studied the synergetic effect of confinement (carbon nanopore size) and surface chemistry (the number of carbonyl groups) on CO2 capture from its mixtures with CH4 at typical operating conditions for industrial adsorptive separation (298 K and compressed CO2/CH4 mixtures). Although both confinement and surface oxidation affect the efficiency of CO2/CH4 adsorptive separation at thermodynamic equilibrium, we show that surface functionalization is the most important factor in designing an efficient adsorbent for CO2 capture. Systematic Monte Carlo simulations revealed that adsorption of CH4, either pure or mixed with CO2, on oxidized nanoporous carbons is only slightly increased by the presence of functional groups (surface dipoles). In contrast, adsorption of CO2 is very sensitive to the number of carbonyl groups, which can be explained by the strong electric quadrupole moment of CO2. Interestingly, the adsorbed amount of CH4 is strongly affected by the presence of co-adsorbed CO2, whereas the CO2 uptake does not depend on the molar ratio of CH4 in the bulk mixture. The optimal carbonaceous porous adsorbent for CO2 capture near ambient conditions should consist of narrow carbon nanopores with oxidized pore walls. Furthermore, the equilibrium separation factor was greatest for CO2/CH4 mixtures with a low CO2 concentration. A maximum equilibrium separation factor of CO2 over CH4 of ∼18–20 is theoretically predicted for strongly oxidized nanoporous carbons. Our findings call for a review of the standard uncharged model of carbonaceous materials used for modeling the adsorptive separation of gas mixtures containing CO2 (and other molecules with a strong electric quadrupole or dipole moment).
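The equilibrium separation factor reported above is the ratio of the CO2:CH4 mole-fraction ratio in the adsorbed phase to that in the bulk gas. A minimal sketch of the calculation (the compositions below are illustrative placeholders, not values from this study):

```python
def separation_factor(x_co2, x_ch4, y_co2, y_ch4):
    """Equilibrium separation factor of CO2 over CH4:
    S = (x_CO2 / x_CH4)_adsorbed / (y_CO2 / y_CH4)_bulk."""
    return (x_co2 / x_ch4) / (y_co2 / y_ch4)

# Hypothetical adsorbed-phase and bulk compositions:
s = separation_factor(0.90, 0.10, 0.30, 0.70)
print(round(s, 1))  # 21.0
```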

Relevance: 100.00%

Abstract:

The objective of this paper is to apply the mis-specification (M-S) encompassing perspective to the problem of choosing between linear and log-linear unit-root models. A simple M-S encompassing test, based on an auxiliary regression stemming from the conditional second moment, is proposed and its empirical size and power are investigated using Monte Carlo simulations. It is shown that by focusing on the conditional process the sampling distributions of the relevant statistics are well behaved under both the null and alternative hypotheses. The proposed M-S encompassing test is illustrated using US total disposable income quarterly data.

Relevance: 100.00%

Abstract:

Although difference-stationary (DS) and trend-stationary (TS) processes have been subject to considerable analysis, there are no direct comparisons with each in turn as the data-generation process (DGP). We examine the consequences of incorrectly choosing between these models for forecasting, with both known and estimated parameters. Three sets of Monte Carlo simulations illustrate the analysis: they evaluate the biases in conventional standard errors when each model is mis-specified, compute the relative mean-square forecast errors of the two models for both DGPs, and investigate autocorrelated errors, so that both models can better approximate the converse DGP. The outcomes are surprisingly different from established results.
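The flavour of such an experiment can be sketched with a toy Monte Carlo in which the DGP is difference-stationary (a random walk with drift) and one-step forecasts from the DS and TS specifications are compared; all parameter values are illustrative, not those of the paper:

```python
import random

def msfe_ds_vs_ts(n=100, reps=2000, drift=0.1, sigma=1.0, seed=0):
    """Monte Carlo comparison of one-step mean-square forecast errors
    when the DGP is difference-stationary (random walk with drift) but
    the forecaster fits either the DS or the TS specification."""
    rng = random.Random(seed)
    se_ds = se_ts = 0.0
    for _ in range(reps):
        y = [0.0]
        for _ in range(n):
            y.append(y[-1] + drift + rng.gauss(0.0, sigma))
        train, actual = y[:-1], y[-1]
        # DS forecast: last observed value plus the estimated drift
        d_hat = (train[-1] - train[0]) / (len(train) - 1)
        f_ds = train[-1] + d_hat
        # TS forecast: OLS linear trend extrapolated one step ahead
        t = range(len(train))
        tb = sum(t) / len(train)
        yb = sum(train) / len(train)
        b = (sum((ti - tb) * (yi - yb) for ti, yi in zip(t, train))
             / sum((ti - tb) ** 2 for ti in t))
        f_ts = yb + b * (len(train) - tb)
        se_ds += (actual - f_ds) ** 2
        se_ts += (actual - f_ts) ** 2
    return se_ds / reps, se_ts / reps
```

When the DGP really is DS, the mis-specified TS forecast accumulates a much larger mean-square error, since a deterministic trend cannot track the stochastic trend of the random walk.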

Relevance: 100.00%

Abstract:

We compare linear autoregressive (AR) models and self-exciting threshold autoregressive (SETAR) models in terms of their point forecast performance, and their ability to characterize the uncertainty surrounding those forecasts, i.e. interval or density forecasts. A two-regime SETAR process is used as the data-generating process in an extensive set of Monte Carlo simulations, and we consider the discriminatory power of recently developed methods of forecast evaluation for different degrees of non-linearity. We find that the interval and density evaluation methods are unlikely to show the linear model to be deficient on samples of the size typical for macroeconomic data.
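A two-regime SETAR(1) data-generating process of the kind used in such a Monte Carlo design can be sketched as follows (the coefficients and threshold are illustrative, not the paper's):

```python
import random

def simulate_setar(n, phi_low=0.9, phi_high=-0.5, threshold=0.0,
                   sigma=1.0, seed=42):
    """Simulate a two-regime SETAR(1) process: the AR(1) coefficient
    switches according to whether the previous value is below or above
    the threshold."""
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(n - 1):
        phi = phi_low if y[-1] <= threshold else phi_high
        y.append(phi * y[-1] + rng.gauss(0.0, sigma))
    return y

series = simulate_setar(200)
print(len(series))  # 200
```

With mildly different regime coefficients, realizations of this process look very much like a linear AR(1), which is the intuition behind the paper's finding that evaluation methods struggle to reject the linear model on short samples.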

Relevance: 100.00%

Abstract:

A procedure is presented for fitting incoherent scatter radar data from non-thermal F-region ionospheric plasma, using theoretical spectra previously predicted. It is found that values of the shape distortion factor D∗, associated with deviations of the ion velocity distribution from a Maxwellian distribution, and ion temperatures can be deduced (the results being independent of the path of iteration) if the angle between the line-of-sight and the geomagnetic field is larger than about 15–20°. The procedure can be used with one or both of two sets of assumptions. These concern the validity of the adopted model for the line-of-sight ion velocity distribution in the one case or for the full three-dimensional ion velocity distribution function in the other. The distribution function employed was developed to describe the line-of-sight velocity distribution for large aspect angles, but both experimental data and Monte Carlo simulations indicate that the form of the field-perpendicular distribution can also describe the distribution at more general aspect angles. The assumption of this form for the line-of-sight velocity distribution at a general aspect angle enables rigorous derivation of values of the one-dimensional, line-of-sight ion temperature. With some additional assumptions (principally that the field-parallel distribution is always Maxwellian and there is a simple relationship between the ion temperature anisotropy and the distortion of the field-perpendicular distribution from a Maxwellian), fits to data for large aspect angles enable determination of line-of-sight temperatures at all aspect angles and hence, of the average ion temperature and the ion temperature anisotropy. For small aspect angles, the analysis is restricted to the determination of the line-of-sight ion temperature because the theoretical spectrum is insensitive to non-thermal effects when the plasma is viewed along directions almost parallel to the magnetic field. 
This limitation is expected to apply to any realistic model of the ion velocity distribution function and its consequences are discussed. Fit strategies which allow for mixed ion composition are also considered. Examples of fits to data from various EISCAT observing programmes are presented.

Relevance: 100.00%

Abstract:

In recent years an increasing number of papers have employed meta-analysis to integrate the effect sizes of a researcher's own series of studies within a single paper ("internal meta-analysis"). Although this approach has the obvious advantage of yielding narrower confidence intervals, we show that it can inadvertently inflate false-positive rates if researchers are motivated to use internal meta-analysis to obtain a significant overall effect. Specifically, if one decides whether to stop or to run a further replication experiment depending on the significance of an internal meta-analysis, false-positive rates increase beyond the nominal level. We conducted a set of Monte Carlo simulations to demonstrate this argument, and provide a literature review to gauge awareness and prevalence of the issue. Furthermore, we make several recommendations for using internal meta-analysis to judge statistical significance.
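The inflation mechanism is easy to reproduce. In the sketch below, every study's z-statistic is drawn under a true null, and the researcher stops as soon as a fixed-effect internal meta-analysis reaches p < .05; the realized false-positive rate then exceeds the nominal 5% (this is an illustration of the argument, not the authors' simulation code):

```python
import math
import random

def meta_p(zs):
    """Fixed-effect meta-analytic two-sided p-value from per-study
    z-scores with equal weights: z_meta = sum(z_i) / sqrt(k)."""
    z = sum(zs) / math.sqrt(len(zs))
    return math.erfc(abs(z) / math.sqrt(2.0))

def false_positive_rate(max_studies=5, alpha=0.05, reps=20000, seed=1):
    """Under a true null (each study's z ~ N(0, 1)), run studies one at
    a time and stop as soon as the internal meta-analysis is
    significant. Optional stopping inflates the false-positive rate."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        zs = []
        for _ in range(max_studies):
            zs.append(rng.gauss(0.0, 1.0))
            if meta_p(zs) < alpha:
                hits += 1
                break
    return hits / reps

print(false_positive_rate() > 0.05)  # True: well above the nominal 5%
```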

Relevance: 100.00%

Abstract:

The structure of a ferrofluid under the influence of an external magnetic field is expected to become anisotropic due to the alignment of the dipoles into the direction of the external field, and subsequently to the formation of particle chains due to the attractive head-to-tail orientations of the ferrofluid particles. Knowledge about the structure of a colloidal ferrofluid can be inferred from scattering data via the measurement of structure factors. We have used molecular-dynamics simulations to investigate the structure of both monodispersed and polydispersed ferrofluids. The results for the isotropic structure factor for monodispersed samples are similar to previous data by Camp and Patey that were obtained using an alternative Monte Carlo simulation technique, but in a different parameter region. Here we look in addition at bidispersed samples and compute the anisotropic structure factor by projecting the q vector onto the XY and XZ planes separately, with the magnetic field applied along the z axis. We observe that the XY-plane structure factor as well as the pair distribution functions are quite different from those obtained for the XZ plane. Further, the two-dimensional structure factor patterns are investigated for both monodispersed and bidispersed samples under different conditions. In addition, we look at the scaling exponents of structure factors. Our results should be of value for interpreting scattering data on ferrofluids obtained under the influence of an external field.
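The structure factor measured in scattering experiments can be computed directly from simulated particle positions; a minimal sketch for a single wave vector q is shown below (the anisotropic case corresponds to restricting q to, e.g., the XY or XZ plane; the lattice example is purely illustrative):

```python
import cmath
import math

def structure_factor(positions, q):
    """Static structure factor S(q) = |sum_j exp(i q . r_j)|^2 / N for
    a list of 3D positions and a single wave vector q."""
    amp = sum(cmath.exp(1j * sum(qc * rc for qc, rc in zip(q, r)))
              for r in positions)
    return abs(amp) ** 2 / len(positions)

# A perfect 1D chain along z: S(q) peaks when q_z hits a reciprocal
# lattice vector (2*pi / spacing) and is nearly zero in between.
chain = [(0.0, 0.0, float(k)) for k in range(100)]
print(round(structure_factor(chain, (0.0, 0.0, 2.0 * math.pi)), 1))  # 100.0
print(structure_factor(chain, (0.0, 0.0, math.pi)) < 1.0)  # True
```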

Relevance: 100.00%

Abstract:

This paper shows that radiometer channel radiances for cloudy atmospheric conditions can be simulated with an optimised frequency grid derived under clear-sky conditions. A new clear-sky optimised grid is derived for AVHRR channel 5 (12 μm, 833 cm⁻¹). For HIRS channel 11 (7.33 μm, 1364 cm⁻¹) and AVHRR channel 5, radiative transfer simulations using an optimised frequency grid are compared with simulations using a reference grid, where the optimised grid has roughly 100–1000 times fewer frequencies than the full grid. The root mean square error between the optimised and the reference simulation is found to be less than 0.3 K for both comparisons, with the magnitude of the bias less than 0.03 K. The simulations have been carried out with the radiative transfer model Atmospheric Radiative Transfer Simulator (ARTS), version 2, using a backward Monte Carlo module for the treatment of clouds. With this module, the optimised simulations are more than 10 times faster than the reference simulations: although the number of photons is the same, the smaller number of frequencies reduces the overhead of preparing the optical properties for each frequency. With deterministic scattering solvers, the relative decrease in runtime would be even greater. The results allow for new radiative transfer applications, such as the development of new retrievals, because it becomes much quicker to carry out a large number of simulations. The conclusions are applicable to any down-looking infrared radiometer.

Relevance: 90.00%

Abstract:

Water quality models generally require a relatively large number of parameters to define their functional relationships, and since prior information on parameter values is limited, these are commonly defined by fitting the model to observed data. In this paper, the identifiability of water quality parameters and the associated uncertainty in model simulations are investigated. A modification to the water quality model 'Quality Simulation Along River Systems' is presented in which an improved flow component is used within the existing water quality model framework. The performance of the model is evaluated in an application to the Bedford Ouse river, UK, using a Monte Carlo analysis toolbox. The essential framework of the model proved to be sound, and calibration and validation performance was generally good. However, some supposedly important water quality parameters associated with algal activity were found to be completely insensitive, and hence non-identifiable, within the model structure, while others (nitrification and sedimentation) had optimum values at or close to zero, indicating that those processes were not detectable from the data set examined.

Relevance: 90.00%

Abstract:

The identification of signatures of natural selection in genomic surveys has become an area of intense research, stimulated by the increasing ease with which genetic markers can be typed. Loci identified as subject to selection may be functionally important, and hence (weak) candidates for involvement in disease causation. They can also be useful in determining the adaptive differentiation of populations, and exploring hypotheses about speciation. Adaptive differentiation has traditionally been identified from differences in allele frequencies among different populations, summarised by an estimate of F-ST. Low outliers relative to an appropriate neutral population-genetics model indicate loci subject to balancing selection, whereas high outliers suggest adaptive (directional) selection. However, the problem of identifying statistically significant departures from neutrality is complicated by confounding effects on the distribution of F-ST estimates, and current methods have not yet been tested in large-scale simulation experiments. Here, we simulate data from a structured population at many unlinked, diallelic loci that are predominantly neutral but with some loci subject to adaptive or balancing selection. We develop a hierarchical-Bayesian method, implemented via Markov chain Monte Carlo (MCMC), and assess its performance in distinguishing the loci simulated under selection from the neutral loci. We also compare this performance with that of a frequentist method, based on moment-based estimates of F-ST. We find that both methods can identify loci subject to adaptive selection when the selection coefficient is at least five times the migration rate. Neither method could reliably distinguish loci under balancing selection in our simulations, even when the selection coefficient is twenty times the migration rate.
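A toy version of the moment-based F_ST estimate that such frequentist scans build on can be written in a few lines (real Weir–Cockerham estimators add sample-size and heterozygosity corrections omitted here):

```python
def fst_moment(freqs):
    """Toy moment-based F_ST for one diallelic locus: the variance of
    allele frequencies among populations divided by p_bar*(1 - p_bar),
    where p_bar is the mean allele frequency across populations."""
    k = len(freqs)
    p_bar = sum(freqs) / k
    var_p = sum((p - p_bar) ** 2 for p in freqs) / k
    denom = p_bar * (1.0 - p_bar)
    return var_p / denom if denom > 0.0 else 0.0

print(fst_moment([0.5, 0.5, 0.5]))              # 0.0 -> no differentiation
print(round(fst_moment([0.05, 0.5, 0.95]), 2))  # 0.54 -> high-F_ST outlier
```

Loci whose estimate lies far above the bulk of the genome-wide distribution are the candidates for directional selection; far below, for balancing selection.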

Relevance: 90.00%

Abstract:

This article introduces a new general method for genealogical inference that samples independent genealogical histories using importance sampling (IS) and then samples other parameters with Markov chain Monte Carlo (MCMC). It is then possible to more easily utilize the advantages of importance sampling in a fully Bayesian framework. The method is applied to the problem of estimating recent changes in effective population size from temporally spaced gene frequency data. The method gives the posterior distribution of effective population size at the time of the oldest sample and at the time of the most recent sample, assuming a model of exponential growth or decline during the interval. The effect of changes in number of alleles, number of loci, and sample size on the accuracy of the method is described using test simulations, and it is concluded that these have an approximately equivalent effect. The method is used on three example data sets and problems in interpreting the posterior densities are highlighted and discussed.
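The importance-sampling ingredient can be illustrated with a minimal self-normalized IS estimator (a generic sketch, unrelated to the genealogical machinery of the paper):

```python
import math
import random

def norm_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return (math.exp(-0.5 * ((x - mu) / sigma) ** 2)
            / (sigma * math.sqrt(2.0 * math.pi)))

def snis_mean(n=200000, seed=3):
    """Self-normalized importance sampling: estimate the mean of a
    N(2, 1) target from independent draws out of a wider N(0, 3)
    proposal, weighting each draw by target / proposal density."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 3.0)  # independent draw from the proposal
        w = norm_pdf(x, 2.0, 1.0) / norm_pdf(x, 0.0, 3.0)  # IS weight
        num += w * x
        den += w
    return num / den

print(round(snis_mean(), 1))  # 2.0
```

Because the histories are drawn independently, such IS estimates are trivially parallel and can be reused while other parameters are updated by MCMC, which is the combination the paper exploits.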

Relevance: 90.00%

Abstract:

Analyses of high-density single-nucleotide polymorphism (SNP) data, such as genetic mapping and linkage disequilibrium (LD) studies, require phase-known haplotypes to allow for the correlation between tightly linked loci. However, current SNP genotyping technology cannot determine phase, which must be inferred statistically. In this paper, we present a new Bayesian Markov chain Monte Carlo (MCMC) algorithm for population haplotype frequency estimation, particularly in the context of LD assessment. The novel feature of the method is the incorporation of a log-linear prior model for population haplotype frequencies. We present simulations to suggest that 1) the log-linear prior model is more appropriate than the standard coalescent process in the presence of recombination (> 0.02 cM between adjacent loci), and 2) there is substantial inflation in measures of LD obtained by a "two-stage" approach to the analysis by treating the "best" haplotype configuration as correct, without regard to uncertainty in the recombination process. Genet Epidemiol 25:106-114, 2003.

Relevance: 90.00%

Abstract:

The hybrid Monte Carlo (HMC) method is a popular and rigorous method for sampling from a canonical ensemble. The HMC method is based on classical molecular dynamics simulations combined with a Metropolis acceptance criterion and a momentum resampling step. While the HMC method completely resamples the momentum after each Monte Carlo step, the generalized hybrid Monte Carlo (GHMC) method can be implemented with a partial momentum refreshment step. This property seems desirable for keeping some of the dynamic information throughout the sampling process, similar to stochastic Langevin and Brownian dynamics simulations. It is, however, essential to the success of the GHMC method that the rejection rate in the molecular dynamics part be kept to a minimum; otherwise an undesirable Zitterbewegung in the Monte Carlo samples is observed. In this paper, we describe a method to achieve very low rejection rates by using a modified energy, which is preserved to high order along molecular dynamics trajectories. The modified energy is based on backward error results for symplectic time-stepping methods. The proposed generalized shadow hybrid Monte Carlo (GSHMC) method is applicable to NVT as well as NPT ensemble simulations.
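The partial momentum refreshment that distinguishes GHMC from plain HMC can be sketched for a toy one-dimensional Gaussian target (an illustrative implementation, not the GSHMC method itself, which additionally replaces the energy in the Metropolis test with a modified shadow energy):

```python
import math
import random

def ghmc_gaussian(n_steps=5000, dt=0.2, n_leap=10, c=0.9, seed=7):
    """Toy GHMC sampler for a 1D standard Gaussian, U(q) = q^2 / 2.
    Each step partially refreshes the momentum,
    p <- c*p + sqrt(1 - c^2)*xi with xi ~ N(0, 1), then integrates
    Hamiltonian dynamics with leapfrog and applies a Metropolis test
    (the momentum is flipped on rejection, which is the source of the
    'Zitterbewegung' when rejection rates are high)."""
    rng = random.Random(seed)
    q, p = 0.0, rng.gauss(0.0, 1.0)
    samples = []
    for _ in range(n_steps):
        p = c * p + math.sqrt(1.0 - c * c) * rng.gauss(0.0, 1.0)
        q_new, p_new = q, p
        # leapfrog integration of dq/dt = p, dp/dt = -dU/dq = -q
        p_new -= 0.5 * dt * q_new
        for i in range(n_leap):
            q_new += dt * p_new
            if i < n_leap - 1:
                p_new -= dt * q_new
        p_new -= 0.5 * dt * q_new
        h_old = 0.5 * (q * q + p * p)
        h_new = 0.5 * (q_new * q_new + p_new * p_new)
        if rng.random() < math.exp(min(0.0, h_old - h_new)):
            q, p = q_new, p_new   # accept the proposed trajectory
        else:
            p = -p                # reject: momentum flip
        samples.append(q)
    return samples
```

With c = 0 this reduces to standard HMC (full momentum resampling); c close to 1 retains most of the dynamic information between steps.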

Relevance: 90.00%

Abstract:

Runoff generation processes and pathways vary widely between catchments. Credible simulations of solute and pollutant transport in surface waters are dependent on models which facilitate appropriate, catchment-specific representations of perceptual models of the runoff generation process. Here, we present a flexible, semi-distributed landscape-scale rainfall-runoff modelling toolkit suitable for simulating a broad range of user-specified perceptual models of runoff generation and stream flow occurring in different climatic regions and landscape types. PERSiST (the Precipitation, Evapotranspiration and Runoff Simulator for Solute Transport) is designed for simulating present-day hydrology; projecting possible future effects of climate or land use change on runoff and catchment water storage; and generating hydrologic inputs for the Integrated Catchments (INCA) family of models. PERSiST has limited data requirements and is calibrated using observed time series of precipitation, air temperature and runoff at one or more points in a river network. Here, we apply PERSiST to the river Thames in the UK and describe a Monte Carlo tool for model calibration, sensitivity and uncertainty analysis.
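The random-sampling core of such a Monte Carlo calibration tool can be sketched generically; the "runoff model" and the Nash-Sutcliffe efficiency objective below are illustrative stand-ins, not PERSiST's actual code:

```python
import random

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of the
    observations; 1 is a perfect fit, 0 is no better than the mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_tot

def monte_carlo_calibrate(model, bounds, observed, n_draws=1000, seed=0):
    """Draw parameter sets uniformly from `bounds`, run `model` on each,
    and return the best (NSE, parameters) pair -- the generic random-
    sampling core of Monte Carlo calibration/uncertainty analysis."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_draws):
        params = {k: rng.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
        results.append((nse(observed, model(params)), params))
    return max(results, key=lambda r: r[0])

# Toy 'runoff model': runoff = coeff * rainfall, with true coeff = 0.6
rain = [1.0, 5.0, 3.0, 8.0, 2.0]
obs = [0.6 * r for r in rain]
best_nse, best_params = monte_carlo_calibrate(
    lambda p: [p["coeff"] * r for r in rain], {"coeff": (0.0, 1.0)}, obs)
print(best_nse > 0.99)  # True
```

Keeping the whole ranked ensemble, rather than only the best draw, is what turns the same machinery into a sensitivity and uncertainty analysis.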

Relevance: 90.00%

Abstract:

We study the orientational ordering on the surface of a sphere using Monte Carlo and Brownian dynamics simulations of rods interacting with an anisotropic potential. We restrict the orientations to the local tangent plane of the spherical surface and fix the position of each rod to be at a discrete point on the spherical surface. On the surface of a sphere, orientational ordering cannot be perfectly nematic due to the inevitable presence of defects. We find that the ground state of four +1/2 point defects is stable across a broad range of temperatures. We investigate the transition from disordered to ordered phase by decreasing the temperature and find a very smooth transition. We use fluctuations of the local directors to estimate the Frank elastic constant on the surface of a sphere and compare it to the planar case. We observe subdiffusive behavior in the mean square displacement of the defect cores and estimate their diffusion constants.
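Diffusion constants of the defect cores are estimated from the mean square displacement; a minimal sketch with a toy random-walk trajectory (normal diffusion gives MSD(t) = 2Dt in 1D, whereas subdiffusion shows MSD(t) ∝ t^α with α < 1):

```python
import random

def msd(traj, max_lag):
    """Mean square displacement of a 1D trajectory for lags
    1..max_lag, averaged over all available time origins."""
    n = len(traj)
    return [sum((traj[i + lag] - traj[i]) ** 2 for i in range(n - lag))
            / (n - lag)
            for lag in range(1, max_lag + 1)]

# Toy random-walk 'defect core' trajectory with unit-variance steps,
# so the true diffusion constant is D = 0.5 (MSD = 2*D*t).
rng = random.Random(0)
x = [0.0]
for _ in range(5000):
    x.append(x[-1] + rng.gauss(0.0, 1.0))

m = msd(x, 20)
# Average MSD(lag)/lag over the first 20 lags estimates 2*D.
d_est = sum(m[lag - 1] / lag for lag in range(1, 21)) / 20 / 2
print(0.35 < d_est < 0.65)  # True
```

Fitting log MSD against log lag instead would recover the exponent α and reveal the subdiffusive behavior reported above.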