976 results for Monte-Carlo Simulations


Relevância:

100.00%

Publicador:

Resumo:

This thesis, composed of four scientific articles, concerns atomistic numerical methods and their application to nanostructured semiconductor systems. We introduce the accelerated methods designed to handle activated events, surveying developments in the field. Our first article follows, treating in detail the kinetic Activation-Relaxation Technique (kinetic ART), a self-learning off-lattice kinetic Monte Carlo algorithm based on the Activation-Relaxation Technique nouveau (ARTn), whose development opens the way to an exact treatment of elastic interactions while allowing the simulation of materials over time spans reaching one second. This algorithmic development, combined with recent experimental data, leads to the second article, which explains the heat release by crystalline silicon following ion implantation with 3 keV Si ions. Through kinetic ART simulations and the analysis of nanocalorimetry data, we show that the relaxation is described by a new two-stage model: "Replenish-and-Relax". This rather general model can potentially explain relaxation in other disordered materials. We then push the analysis further: the third article offers an in-depth analysis of the atomistic mechanisms responsible for relaxation during annealing. We show that elastic interactions between point defects and small defect complexes control the relaxation, in sharp contrast with the literature, which postulates that "amorphous pockets" play this role. We also study certain aspects of the growth of Ge quantum dots on Si(001).
After a short contextualization and an additional methodological introduction, the fourth article describes the structure of the wetting layer during the deposition of Ge on Si(001) using a QM/MM implementation of the BigDFT-ART code. We characterize the structure of the 2×N surface reconstruction and lower the theoretically predicted temperature threshold for Ge subsurface diffusion by more than 100 K.
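The self-learning off-lattice algorithm described above is far richer than can be shown here, but its core, like any kinetic Monte Carlo method, is the rejection-free (BKL) event-selection step. A minimal sketch in Python; the prefactor and barrier values are illustrative assumptions, not numbers from the thesis:

```python
import math
import random

def kmc_step(rates, rng=random.Random(0)):
    """One rejection-free (BKL) kinetic Monte Carlo step.

    rates: rates (1/s) of the currently available activated events.
    Returns (chosen_event_index, time_increment).
    """
    total = sum(rates)
    r = rng.random() * total
    chosen = len(rates) - 1  # fallback against floating-point round-off
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r < acc:
            chosen = i
            break
    # Advance the clock by an exponentially distributed waiting time.
    dt = -math.log(1.0 - rng.random()) / total
    return chosen, dt

# Illustrative harmonic transition-state-theory rates k = nu0 * exp(-Eb / kT).
nu0, kT = 1e13, 0.025          # assumed prefactor (1/s) and k_B T (eV)
barriers = [0.5, 0.6, 0.7]     # assumed barriers (eV)
rates = [nu0 * math.exp(-b / kT) for b in barriers]
event, dt = kmc_step(rates)
```

Each step selects an event with probability proportional to its rate and advances the clock by an exponentially distributed waiting time, which is what lets kinetic Monte Carlo reach time spans of the order of seconds.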

The modelling work was carried out with EGSnrc, a software package developed by the National Research Council Canada.

A Monte Carlo study of the late-time growth of L12-ordered domains in an fcc A3B binary alloy is presented. The energy of the alloy is modeled by a nearest-neighbor Ising Hamiltonian. The system exhibits a fourfold-degenerate ground state and two kinds of interfaces separating ordered domains: flat and curved antiphase boundaries. Two different dynamics are used in the simulations, the standard atom-atom exchange mechanism and the more realistic vacancy-atom exchange mechanism, and the results obtained by the two methods are compared. In particular, we study the time evolution of the excess energy, the structure factor and the mean distance between walls. With the atom-atom exchange mechanism, anisotropic growth is found: two characteristic lengths are needed to describe the evolution. In contrast, with the vacancy-atom exchange mechanism, scaling with a single length holds. Results are contrasted with existing experiments in Cu3Au and with theories of anisotropic growth.

The application of forecast ensembles to probabilistic weather prediction has spurred considerable interest in their evaluation. Such ensembles are commonly interpreted as Monte Carlo ensembles meaning that the ensemble members are perceived as random draws from a distribution. Under this interpretation, a reasonable property to ask for is statistical consistency, which demands that the ensemble members and the verification behave like draws from the same distribution. A widely used technique to assess statistical consistency of a historical dataset is the rank histogram, which uses as a criterion the number of times that the verification falls between pairs of members of the ordered ensemble. Ensemble evaluation is rendered more specific by stratification, which means that ensembles that satisfy a certain condition (e.g., a certain meteorological regime) are evaluated separately. Fundamental relationships between Monte Carlo ensembles, their rank histograms, and random sampling from the probability simplex according to the Dirichlet distribution are pointed out. Furthermore, the possible benefits and complications of ensemble stratification are discussed. The main conclusion is that a stratified Monte Carlo ensemble might appear inconsistent with the verification even though the original (unstratified) ensemble is consistent. The apparent inconsistency is merely a result of stratification. Stratified rank histograms are thus not necessarily flat. This result is demonstrated by perfect ensemble simulations and supplemented by mathematical arguments. Possible methods to avoid or remove artifacts that stratification induces in the rank histogram are suggested.
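The rank histogram described above is straightforward to compute. A minimal sketch with a synthetic, statistically consistent ensemble (members and verification drawn from the same distribution), breaking ties randomly as is common practice:

```python
import numpy as np

def rank_histogram(ensemble, verification, rng=np.random.default_rng(0)):
    """Rank histogram for an m-member ensemble over n forecast cases.

    ensemble: (n, m) array; verification: (n,) array.
    Returns counts of the verification's rank among the members (m + 1 bins).
    """
    n, m = ensemble.shape
    ranks = np.empty(n, dtype=int)
    for i in range(n):
        # Rank = number of members below the verification; ties broken randomly.
        below = np.sum(ensemble[i] < verification[i])
        ties = np.sum(ensemble[i] == verification[i])
        ranks[i] = below + rng.integers(0, ties + 1)
    return np.bincount(ranks, minlength=m + 1)

# A consistent ensemble: the histogram should come out approximately flat.
rng = np.random.default_rng(1)
ens = rng.normal(size=(20000, 4))
ver = rng.normal(size=20000)
counts = rank_histogram(ens, ver)
```

For a consistent unstratified ensemble the counts are near-uniform; the article's point is that stratifying the cases before building this histogram can make it non-flat even then.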

The Monte Carlo Independent Column Approximation (McICA) is a flexible method for representing subgrid-scale cloud inhomogeneity in radiative transfer schemes. It does, however, introduce conditional random errors, but these have been shown to have little effect on climate simulations, where the spatial and temporal scales of interest are large enough for the effects of noise to be averaged out. This article considers the effect of McICA noise on a numerical weather prediction (NWP) model, where the time and spatial scales of interest are much closer to those at which the errors manifest themselves; this, as we show, means that the noise is more significant. We suggest methods for efficiently reducing the magnitude of McICA noise and test these methods in a global NWP version of the UK Met Office Unified Model (MetUM). The resultant errors are put into context by comparison with errors due to the widely used assumption of maximum-random overlap of plane-parallel homogeneous cloud. For a simple implementation of the McICA scheme, forecasts of near-surface temperature are found to be worse than those obtained using the plane-parallel, maximum-random-overlap representation of clouds. However, by applying the methods suggested in this article, we can reduce the noise enough to give forecasts of near-surface temperature that improve on the plane-parallel, maximum-random-overlap forecasts. We conclude that the McICA scheme can be used to improve the representation of clouds in NWP models, provided that the associated noise is sufficiently small.
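The conditional random error discussed above can be illustrated with a toy model: estimating a domain-mean quantity from randomly sampled subcolumns, in the spirit of McICA's one subcolumn per spectral point. This is not the MetUM implementation, just a sketch (with fabricated subcolumn values) of why the noise shrinks as more samples are averaged:

```python
import numpy as np

def mcica_estimate(subcol_values, n_samples, rng):
    """Toy McICA-style estimate of a domain-mean radiative quantity.

    Instead of averaging over all subcolumns (the full independent column
    approximation), draw n_samples subcolumns at random and average those.
    """
    picks = rng.integers(0, len(subcol_values), size=n_samples)
    return subcol_values[picks].mean()

rng = np.random.default_rng(0)
subcols = rng.gamma(shape=2.0, scale=1.0, size=1000)  # fake cloudy subcolumns
exact = subcols.mean()                                # full-ICA reference

# The sampling error (the "conditional random noise") shrinks as the
# number of sampled subcolumns grows, roughly as 1/sqrt(n_samples).
err_small = np.std([mcica_estimate(subcols, 10, rng) - exact for _ in range(500)])
err_large = np.std([mcica_estimate(subcols, 100, rng) - exact for _ in range(500)])
```

In an NWP context the article's concern is precisely that, at forecast scales, this residual noise is no longer averaged out.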

We describe canonical and microcanonical Monte Carlo algorithms for systems that can be described by spin models. Sites of the lattice, chosen at random, interchange their spin values, provided they are different. The canonical ensemble is generated by performing exchanges according to the Metropolis prescription, whereas in the microcanonical ensemble exchanges are performed only as long as the total energy remains constant. A systematic finite-size analysis of intensive quantities and a comparison with results obtained from distinct ensembles are performed, and the quality of the results shows that the present approach may be a useful tool for the study of phase transitions, especially first-order transitions.
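A minimal sketch of the canonical (Metropolis) spin-exchange dynamics just described, on a small 2D Ising lattice; the microcanonical variant would instead accept an exchange only when the energy change is zero. The brute-force energy recomputation is for clarity, not efficiency:

```python
import math
import random

def total_energy(spins, L):
    """Nearest-neighbor Ising energy E = -sum_<ij> s_i s_j, periodic boundaries."""
    E = 0
    for i in range(L):
        for j in range(L):
            E -= spins[i][j] * (spins[(i + 1) % L][j] + spins[i][(j + 1) % L])
    return E

def canonical_exchange_step(spins, L, beta, rng):
    """Pick two random sites; if their spins differ, swap them with the
    Metropolis acceptance probability min(1, exp(-beta * dE))."""
    i1, j1 = rng.randrange(L), rng.randrange(L)
    i2, j2 = rng.randrange(L), rng.randrange(L)
    if spins[i1][j1] == spins[i2][j2]:
        return
    E_old = total_energy(spins, L)
    spins[i1][j1], spins[i2][j2] = spins[i2][j2], spins[i1][j1]
    dE = total_energy(spins, L) - E_old
    if dE > 0 and rng.random() >= math.exp(-beta * dE):
        spins[i1][j1], spins[i2][j2] = spins[i2][j2], spins[i1][j1]  # reject

rng = random.Random(0)
L = 8
spins = [[rng.choice([-1, 1]) for _ in range(L)] for _ in range(L)]
m0 = sum(sum(row) for row in spins)  # magnetization, conserved by exchanges
for _ in range(2000):
    canonical_exchange_step(spins, L, beta=1.0, rng=rng)
m1 = sum(sum(row) for row in spins)
```

Because only unlike spins are swapped, the total magnetization is an exact invariant of this dynamics, which is the point of working with exchange (Kawasaki-type) moves rather than single-spin flips.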

Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains in forecasting accuracy and estimation uncertainty from two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration; the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models, with too small a lag length; criteria selecting lag and rank simultaneously perform better in this case. Second, this translates into a superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even for long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
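The lag-selection exercise in the first issue can be sketched for an unrestricted VAR with standard information criteria (no SCCF or cointegration restrictions, which the paper's modified criteria handle). A toy Monte Carlo draw from a bivariate VAR(1); all parameter values are illustrative:

```python
import numpy as np

def var_criteria(Y, p, pmax):
    """OLS fit of a VAR(p) on a common sample (dropping pmax presample points)
    and the AIC, Hannan-Quinn (HQ) and Schwarz (SC) information criteria."""
    X = np.array([np.concatenate([[1.0]] + [Y[t - l] for l in range(1, p + 1)])
                  for t in range(pmax, len(Y))])
    Z = Y[pmax:]
    B, *_ = np.linalg.lstsq(X, Z, rcond=None)
    resid = Z - X @ B
    T = len(Z)
    Sigma = resid.T @ resid / T          # residual covariance
    ll = np.log(np.linalg.det(Sigma))
    n_par = B.size                       # number of estimated coefficients
    return (ll + 2.0 * n_par / T,                       # AIC
            ll + 2.0 * np.log(np.log(T)) * n_par / T,   # HQ
            ll + np.log(T) * n_par / T)                 # SC

# Simulate a bivariate VAR(1) and let each criterion choose the lag length.
rng = np.random.default_rng(0)
A1 = np.array([[0.5, 0.1], [0.0, 0.4]])
Y = np.zeros((400, 2))
for t in range(1, 400):
    Y[t] = A1 @ Y[t - 1] + rng.normal(size=2)

pmax = 4
crits = {p: var_criteria(Y, p, pmax) for p in range(1, pmax + 1)}
best = {name: min(crits, key=lambda p: crits[p][i])
        for i, name in enumerate(["AIC", "HQ", "SC"])}
```

Because SC penalizes parameters more heavily than HQ, and HQ more than AIC, the chosen lags are weakly ordered SC <= HQ <= AIC on a common sample, which is the mechanism behind the over- and under-selection tendencies the paper's simulations quantify.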

Despite the belief, supported by recent applied research, that aggregate data display short-run comovement, there has been little discussion of the econometric consequences of these data "features." We use exhaustive Monte Carlo simulations to investigate the importance of restrictions implied by common-cyclical features for estimates and forecasts based on vector autoregressive and error-correction models. First, we show that the "best" empirical model developed without common-cycle restrictions need not nest the "best" model developed with those restrictions, due to the use of information criteria for choosing the lag order of the two alternative models. Second, we show that the costs of ignoring common-cyclical features in VAR analysis may be high in terms of forecasting accuracy and efficiency of estimates of variance-decomposition coefficients. Although these costs are more pronounced when the lag order of the VAR models is known, they are also non-trivial when it is selected using the conventional tools available to applied researchers. Third, we find that if the data have common-cyclical features and the researcher wants to use an information criterion to select the lag length, the Hannan-Quinn criterion is the most appropriate, since the Akaike and the Schwarz criteria have a tendency to over- and under-predict the lag length, respectively, in our simulations.

Using data from a single simulation, we obtain Monte Carlo renormalization-group information in a finite region of parameter space by adapting the Ferrenberg-Swendsen histogram method. Several quantities are calculated in the two-dimensional N = 2 Ashkin-Teller and Ising models to show the feasibility of the method. We show renormalization-group Hamiltonian flows and locate the critical point by matching correlations, using just two simulations at a single temperature in lattices of different sizes to partially eliminate finite-size effects.
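The Ferrenberg-Swendsen idea of extracting information at nearby couplings from a single simulation can be sketched in its simplest, single-histogram form; the exactly solvable two-level system below is purely illustrative:

```python
import numpy as np

def reweight_mean_energy(E_samples, beta0, beta):
    """Ferrenberg-Swendsen single-histogram reweighting: from energies sampled
    at inverse temperature beta0, estimate <E> at a nearby beta by weighting
    each sample with exp(-(beta - beta0) * E)."""
    E = np.asarray(E_samples, dtype=float)
    logw = -(beta - beta0) * E
    logw -= logw.max()               # guard against overflow
    w = np.exp(logw)
    return float(np.sum(w * E) / np.sum(w))

# Toy check: a two-level system with energies 0 and 1 has
# <E>(beta) = 1 / (1 + exp(beta)) exactly.
rng = np.random.default_rng(0)
beta0 = 1.0
p1 = 1.0 / (1.0 + np.exp(beta0))     # Boltzmann probability of E = 1 at beta0
E = (rng.random(200000) < p1).astype(float)  # "simulation" at beta0
est = reweight_mean_energy(E, beta0, beta=1.2)
exact = 1.0 / (1.0 + np.exp(1.2))
```

One set of samples at beta0 thus yields estimates over a whole neighborhood of couplings, which is what allows the renormalization-group flows above to be traced from a single simulation.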

The understanding of electrostatic interactions is an essential aspect of the complex correlation between the structure and function of biological macromolecules. It is also important in protein engineering and design. Theoretical studies of such interactions are predominantly done within the framework of Debye-Hückel theory; a classical example is the Tanford-Kirkwood (TK) model. Among other limitations, this model assumes an infinitesimally small macromolecule concentration. By comparison with Monte Carlo (MC) simulations, it is shown that TK predictions for the shifts in ion-binding constants upon addition of salt become less reliable even at moderate macromolecular concentrations. A simple modification to the TK scheme, based on the colloid literature, is suggested. The modified TK models suggested here satisfactorily predict MC and experimental shifts in the calcium-binding constant as a function of protein concentration for the calbindin D-9k mutant and for calmodulin.

Using a new reverse Monte Carlo (RMC) algorithm, we present simulations that reproduce very well several structural and thermodynamic properties of liquid water. The radial distribution functions used as input, from Monte Carlo and molecular dynamics simulations as well as from experiment, are accurately reproduced using a small number of molecules and no external constraints. Ad hoc energy and hydrogen-bond analyses show the physical consistency and the limitations of the generated RMC configurations.
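Reverse Monte Carlo accepts moves based on the agreement between simulated and target data rather than on an energy function. A deliberately tiny sketch, fitting a target histogram instead of a radial distribution function, in a greedy zero-temperature variant (full RMC also accepts some uphill moves); the algorithm of the paper is of course far richer:

```python
import random

def chi2(hist, target):
    """Squared deviation between a configuration's histogram and the target."""
    return sum((h - t) ** 2 for h, t in zip(hist, target))

def histogram(xs, nbins):
    h = [0] * nbins
    for x in xs:
        h[min(int(x * nbins), nbins - 1)] += 1
    return h

def rmc_fit(xs, target, nbins, steps, max_move=0.1, rng=random.Random(0)):
    """Toy reverse Monte Carlo: randomly displace one 'particle' in [0, 1)
    and keep the move only if it does not worsen the fit to the target."""
    cost = chi2(histogram(xs, nbins), target)
    for _ in range(steps):
        i = rng.randrange(len(xs))
        old = xs[i]
        xs[i] = (old + rng.uniform(-max_move, max_move)) % 1.0
        new_cost = chi2(histogram(xs, nbins), target)
        if new_cost <= cost:
            cost = new_cost
        else:
            xs[i] = old
    return cost

rng = random.Random(1)
xs = [rng.random() for _ in range(200)]   # initial configuration
target = [50, 50, 50, 50]                 # flat target histogram, 4 bins
start = chi2(histogram(xs, 4), target)
final = rmc_fit(xs, target, nbins=4, steps=2000, rng=rng)
```

The structural point carries over directly: replace the histogram with a radial distribution function and the moves with molecular displacements, and the accepted configurations reproduce the input g(r) by construction.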

The sequential Monte Carlo / Quantum Mechanics method was used to obtain the solvatochromic shifts and the dipole moments of the following organic molecular systems: uracil in aqueous solution, β-carotene in oleic acid, ricinoleic acid in methanol and in ethanol, and oleic acid in methanol and in ethanol. Geometry optimizations and charge distributions were obtained with Density Functional Theory using the B3LYP functional and the 6-31G(d) basis set for all molecules except water and uracil, for which the 6-311++G(d,p) basis set was used. In the classical Monte Carlo treatment, the Metropolis algorithm was applied through the DICE program. The selection of statistically relevant configurations for the calculation of average properties was implemented using the autocorrelation function computed for each system. The radial distribution function of the molecular liquids was used to separate the first solvation shell, which establishes the main solute-solvent interaction. The relevant configurations of the first solvation shell of each system were submitted to semi-empirical quantum calculations with the ZINDO/S-CI method. Absorption spectra were obtained for the solutes in the gas phase and for the molecular liquid systems mentioned, and their electric dipole moments were also obtained. All absorption bands of the systems showed a blue shift, except the second band of the β-carotene in oleic acid system, which showed a red shift. The results are in excellent agreement with experimental values found in the literature. All systems showed an increase in the electric dipole moment, since the solvent molecules are polar.
The fatty-acid-in-alcohol systems gave very similar results; that is, the fatty acids mentioned show similar spectroscopic behavior in the same solvents. The simulations demonstrate that the sequential Monte Carlo / Quantum Mechanics methodology is effective for obtaining the spectroscopic properties of the molecular liquids analyzed.
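The autocorrelation-based selection of statistically relevant configurations mentioned above can be sketched as follows; the synthetic AR(1) series stands in for a Monte Carlo energy trace and is not data from the study:

```python
import numpy as np

def autocorrelation(x):
    """Normalized autocorrelation C(t) of a scalar Monte Carlo time series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    c = np.correlate(x, x, mode="full")[len(x) - 1:]
    return c / c[0]

def correlation_interval(x, threshold=np.e ** -1):
    """First lag at which C(t) drops below 1/e; configurations separated by
    at least this interval are treated as statistically uncorrelated."""
    C = autocorrelation(x)
    for t, v in enumerate(C):
        if v < threshold:
            return t
    return len(C)

# Synthetic correlated series: AR(1) with coefficient 0.9, so the true
# autocorrelation is 0.9**t and the 1/e crossing is near lag 10.
rng = np.random.default_rng(0)
phi, n = 0.9, 5000
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

tau = correlation_interval(x)
kept = x[::tau]   # decorrelated subset used for the averages
```

Keeping only every tau-th configuration is what makes the quantum calculations on the first solvation shell affordable without biasing the averaged properties.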

The huge demand for procedures involving ionizing radiation creates the need for safe methods of experimentation, given the danger of its biological effects and the consequent risk to humans. Brazilian legislation prohibits experiments involving this type of radiation in humans through Decree 453 of the Ministry of Health, which determines that such procedures comply with the principles of justification, optimization and dose limitation. Along this line, and concurrently with the advancement of available computer processing power, computational simulations have become relevant in situations where experimental procedures are too costly or impractical. The Monte Carlo method, created during the Manhattan Project in World War II, is a powerful strategy for simulations in computational physics. In medical physics, this technique has been used extensively, with applications in diagnostics and cancer treatment. The objective of this work is to simulate the production and detection of X-rays in the energy range of diagnostic radiology, for a molybdenum target, using the Geant4 toolkit. X-ray tubes with this target material are used in diagnostic radiology, specifically in mammography, one of the most widely used techniques for breast cancer screening in women. In the simulations, we used different models for bremsstrahlung available in the low-energy physics models, in situations already covered by the literature for earlier versions of Geant4. Our results show that although the physical situations seem qualitatively adequate, quantitative comparisons with available analytical data reveal flaws in the code of the Geant4 Low Energy source.
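Geant4 itself cannot be condensed into a few lines, but the core Monte Carlo ingredient of photon transport, sampling free paths from the exponential attenuation law, can be sketched independently; the attenuation coefficient and slab thickness below are illustrative, not values from this work:

```python
import math
import random

def transmitted_fraction(mu, thickness, n_photons, rng=random.Random(0)):
    """Toy Monte Carlo photon transport through a purely absorbing slab.

    Each photon's free path is sampled from p(x) = mu * exp(-mu * x) by
    inverse-CDF sampling; photons whose path exceeds the slab thickness
    are transmitted. The analytic answer is exp(-mu * thickness).
    """
    survived = 0
    for _ in range(n_photons):
        path = -math.log(1.0 - rng.random()) / mu
        if path > thickness:
            survived += 1
    return survived / n_photons

# Example: mu = 0.5 cm^-1 through a 2 cm slab; exact transmission is e^-1.
frac = transmitted_fraction(mu=0.5, thickness=2.0, n_photons=200000)
```

Full transport codes such as Geant4 add scattering, secondary-particle production and energy-dependent cross sections on top of exactly this kind of sampled free flight; the quantitative comparison against analytic data above is what exposed the low-energy bremsstrahlung flaws.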

Graduate Program in Physics - IFT

The extrapolation chamber is a parallel-plate ionization chamber that allows variation of its air-cavity volume. In this work, we report an experimental study and MCNP-4C Monte Carlo simulations of an ionization chamber designed and constructed at the Calibration Laboratory at IPEN to be used as a secondary dosimetry standard for low-energy X-rays. The results obtained were within the international recommendations, and the simulations showed that the components of the extrapolation chamber may influence its response by up to 11.0%.