966 results for Monte-carlo Calculations
Abstract:
The formation of complexes appearing in solutions containing oppositely charged polyelectrolytes has been investigated by Monte Carlo simulations using two different models. The polyions are described as flexible chains of 20 connected charged hard spheres immersed in a homogeneous dielectric background representing water. The small ions are either explicitly included or their effect described by a screened Coulomb potential. The simulated solutions contained 10 positively charged polyions with 0, 2, or 5 negatively charged polyions and the respective counterions. Two different linear charge densities were considered, and structure factors, radial distribution functions, and polyion extensions were determined. A redistribution of positively charged polyions, involving strong complexes formed between the oppositely charged polyions, appeared as the number of negatively charged polyions was increased. The nature of the complexes was found to depend on the linear charge density of the chains. The simplified model involving the screened Coulomb potential gave results qualitatively similar to those of the model with explicit small ions. Finally, owing to the complex formation, the sampling of configurational space is nontrivial, and the efficiency of different trial moves was examined.
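For orientation, the screened-Coulomb variant of such a simulation can be sketched as follows. The bead diameter, Bjerrum length, screening constant, and function names are illustrative placeholders rather than the authors' parameters, and chain-connectivity constraints, cluster/pivot moves, and the explicit-ion model are omitted.

```python
import numpy as np

# Illustrative parameters (not taken from the paper): hard-sphere diameter,
# Bjerrum length of water, and inverse Debye screening length; energies in units of kT.
SIGMA, L_B, KAPPA = 1.0, 7.1, 0.1

def pair_energy(r, zi, zj):
    """Screened Coulomb (Yukawa) interaction; hard-sphere overlap is forbidden."""
    if r < SIGMA:
        return np.inf
    return L_B * zi * zj * np.exp(-KAPPA * r) / r

def bead_energy(pos, charges, i):
    """Interaction energy of bead i with all other beads."""
    return sum(pair_energy(np.linalg.norm(pos[i] - pos[j]), charges[i], charges[j])
               for j in range(len(pos)) if j != i)

def trial_translation(pos, charges, step=0.5, rng=np.random.default_rng()):
    """One single-bead translation trial move with Metropolis acceptance."""
    i = rng.integers(len(pos))
    old = pos[i].copy()
    e_old = bead_energy(pos, charges, i)
    pos[i] = old + rng.uniform(-step, step, size=3)
    e_new = bead_energy(pos, charges, i)
    if rng.random() >= np.exp(-(e_new - e_old)):   # energies already in units of kT
        pos[i] = old                               # reject: restore the old position
    return pos
```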
Abstract:
The dependency of the blood oxygenation level dependent (BOLD) signal on the underlying hemodynamics is not well understood. Building a forward biophysical model of this relationship is important for the quantitative estimation of the hemodynamic changes and neural activity underlying functional magnetic resonance imaging (fMRI) signals. We have developed a general model of the BOLD signal that captures both intra- and extravascular signals for an arbitrary tissue model across a wide range of imaging parameters. The model was instantiated as a look-up table (LUT) and verified against concurrent fMRI and optical imaging measurements of activation-induced hemodynamics.
Abstract:
This paper employs an extensive Monte Carlo study to test the size and power of the BDS and close return methods of testing for departures from independent and identical distribution. The finite sample properties of the BDS test are found to be far superior, and the close return method cannot be recommended as a model diagnostic. Neither test can be reliably used for very small samples, while the close return test has low power even at large sample sizes.
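A Monte Carlo size-and-power experiment of this kind can be sketched as below. A simple lag-1 autocorrelation z-test stands in for the BDS and close-return statistics, which are not implemented here; the sample size, the AR(1) alternative, and the 5% level are arbitrary illustrations.

```python
import math
import numpy as np

def lag1_test_pvalue(x):
    """Stand-in diagnostic: two-sided z-test on the lag-1 autocorrelation (not the BDS statistic)."""
    x = x - x.mean()
    r1 = float(np.dot(x[:-1], x[1:]) / np.dot(x, x))
    z = abs(r1) * math.sqrt(len(x))                 # approximately N(0, 1) under the i.i.d. null
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def rejection_rate(generate, n_obs, n_rep=2000, alpha=0.05, seed=0):
    """Fraction of Monte Carlo replications in which the test rejects at level alpha."""
    rng = np.random.default_rng(seed)
    return np.mean([lag1_test_pvalue(generate(rng, n_obs)) < alpha for _ in range(n_rep)])

def ar1(rng, n, phi=0.5):
    """A dependent alternative used to gauge power."""
    x = np.empty(n)
    x[0] = rng.standard_normal()
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

print("size (i.i.d. null) :", rejection_rate(lambda rng, n: rng.standard_normal(n), n_obs=100))
print("power (AR(1) data) :", rejection_rate(ar1, n_obs=100))
```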
Abstract:
Monte Carlo algorithms often aim to draw from a distribution π by simulating a Markov chain with transition kernel P such that π is invariant under P. However, there are many situations for which it is impractical or impossible to draw from the transition kernel P. For instance, this is the case with massive datasets, where it is prohibitively expensive to calculate the likelihood, and also for intractable likelihood models arising from, for example, Gibbs random fields, such as those found in spatial statistics and network analysis. A natural approach in these cases is to replace P by an approximation P̂. Using theory from the stability of Markov chains, we explore a variety of situations where it is possible to quantify how 'close' the chain given by the transition kernel P̂ is to the chain given by P. We apply these results to several examples from spatial statistics and network analysis.
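A minimal sketch of one way such an approximate kernel P̂ arises in the massive-data setting: a random-walk Metropolis sampler in which the exact Gaussian log-likelihood over a large dataset is replaced by a noisy subsample estimate. The model, batch size, and flat prior are illustrative assumptions, not the constructions analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=1.0, size=100_000)   # toy "massive" dataset, known unit variance

def approx_loglik(theta, batch_size=1_000):
    """Noisy subsample estimate of the log-likelihood; using it defines the perturbed kernel P-hat."""
    batch = rng.choice(data, size=batch_size, replace=False)
    return data.size * np.mean(-0.5 * (batch - theta) ** 2)

def approximate_mh(n_iter=5_000, step=0.05, theta0=0.0):
    """Random-walk Metropolis driven by the approximate log-likelihood (flat prior assumed)."""
    theta, ll = theta0, approx_loglik(theta0)
    chain = np.empty(n_iter)
    for t in range(n_iter):
        proposal = theta + step * rng.standard_normal()
        ll_prop = approx_loglik(proposal)
        if np.log(rng.random()) < ll_prop - ll:
            theta, ll = proposal, ll_prop
        chain[t] = theta
    return chain

chain = approximate_mh()
print("posterior mean estimate:", chain[2_500:].mean())   # close to, but biased around, 2.0
```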
Abstract:
The Monte Carlo Independent Column Approximation (McICA) is a flexible method for representing subgrid-scale cloud inhomogeneity in radiative transfer schemes. It does, however, introduce conditional random errors, but these have been shown to have little effect on climate simulations, where the spatial and temporal scales of interest are large enough for the effects of noise to be averaged out. This article considers the effect of McICA noise on a numerical weather prediction (NWP) model, where the time and spatial scales of interest are much closer to those at which the errors manifest themselves; this, as we show, means that the noise is more significant. We suggest methods for efficiently reducing the magnitude of McICA noise and test these methods in a global NWP version of the UK Met Office Unified Model (MetUM). The resulting errors are put into context by comparison with errors due to the widely used assumption of maximum-random overlap of plane-parallel homogeneous cloud. For a simple implementation of the McICA scheme, forecasts of near-surface temperature are found to be worse than those obtained using the plane-parallel, maximum-random-overlap representation of clouds. However, by applying the methods suggested in this article, we can reduce the noise enough to give forecasts of near-surface temperature that improve on the plane-parallel, maximum-random-overlap forecasts. We conclude that the McICA scheme can be used to improve the representation of clouds in NWP models, provided that the associated noise is sufficiently small.
Abstract:
We discuss the basic hydrodynamics that determines the density structure of the disks around hot stars. Observational evidence supports the idea that these disks are Keplerian (rotationally supported) gaseous disks. A popular scenario in the literature, which naturally leads to the formation of Keplerian disks, is the viscous decretion model. According to this scenario, the disks are hydrostatically supported in the vertical direction, while the radial structure is governed by the viscous transport. This suggests that the temperature is one primary factor that governs the disk density structure. In a previous study we demonstrated, using three-dimensional non-LTE Monte Carlo simulations, that viscous Keplerian disks can be highly nonisothermal. In this paper we build on our previous work and solve the full problem of the steady state nonisothermal viscous diffusion and vertical hydrostatic equilibrium. We find that the self-consistent solution departs significantly from the analytic isothermal density, with potentially large effects on the emergent spectrum. This implies that nonisothermal disk models must be used for a detailed modeling of Be star disks.
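As a point of reference, the isothermal vertical structure from which the self-consistent solution is shown to depart is the standard hydrostatic result for a geometrically thin Keplerian disk (a textbook relation, not an equation quoted from the paper):

```latex
% Vertical hydrostatic equilibrium for an isothermal, thin Keplerian disk
\frac{1}{\rho}\frac{\partial P}{\partial z} = -\frac{G M_\ast\, z}{(r^2+z^2)^{3/2}}
\quad\xrightarrow{\;z \ll r,\;\; P = c_s^2 \rho\;}\quad
\rho(r,z) = \rho_0(r)\,\exp\!\left[-\frac{z^2}{2H(r)^2}\right],
\qquad H(r) = \frac{c_s(r)}{\Omega_K(r)},\quad \Omega_K(r) = \sqrt{\frac{G M_\ast}{r^3}}
```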
Abstract:
The triple- and quadruple-escape peaks of 6.128 MeV photons from the ¹⁹F(p,αγ)¹⁶O nuclear reaction were observed in an HPGe detector. The experimental peak areas, measured in spectra projected with a restriction function that allows quantitative comparison of data of different multiplicities, are in reasonably good agreement with those predicted by Monte Carlo simulations performed with the general-purpose radiation-transport code PENELOPE. The behaviour of the escape intensities was simulated for several gamma-ray energies and detector dimensions; the results obtained can be extended to other energies using an empirical function and statistical properties related to the phenomenon.
Abstract:
We describe canonical and microcanonical Monte Carlo algorithms for systems that can be described by spin models. Lattice sites, chosen at random, interchange their spin values, provided the values differ. The canonical ensemble is generated by performing exchanges according to the Metropolis prescription, whereas in the microcanonical ensemble exchanges are performed only as long as the total energy remains constant. A systematic finite-size analysis of intensive quantities and a comparison with results obtained from distinct ensembles are performed, and the quality of the results indicates that the present approach may be a useful tool for the study of phase transitions, especially first-order transitions.
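A minimal sketch of the canonical version of the exchange dynamics described above, for a two-dimensional Ising-like lattice with periodic boundaries; the lattice size, coupling, and temperature are illustrative. The microcanonical variant would instead accept only those exchanges that leave the total energy unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
L, J, BETA = 32, 1.0, 0.5                      # lattice size, coupling, inverse temperature
spins = rng.choice([-1, 1], size=(L, L))       # exchanges conserve the total magnetization

def site_energy(s, i, j):
    """Interaction energy of site (i, j) with its four nearest neighbours (periodic boundaries)."""
    return -J * s[i, j] * (s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L])

def exchange_sweep(s):
    """One canonical sweep: propose L*L random-pair exchanges, accept with the Metropolis rule."""
    for _ in range(L * L):
        i1, j1, i2, j2 = rng.integers(L, size=4)
        if s[i1, j1] == s[i2, j2]:
            continue                            # only unlike spin values are interchanged
        e_old = site_energy(s, i1, j1) + site_energy(s, i2, j2)
        s[i1, j1], s[i2, j2] = s[i2, j2], s[i1, j1]
        e_new = site_energy(s, i1, j1) + site_energy(s, i2, j2)
        if rng.random() >= np.exp(-BETA * (e_new - e_old)):
            s[i1, j1], s[i2, j2] = s[i2, j2], s[i1, j1]   # reject: swap back
    return s
```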
Chinese Basic Pension Substitution Rate: A Monte Carlo Demonstration of the Individual Account Model
Abstract:
At the end of 2005, the State Council of China passed "The Decision on Adjusting the Individual Account of the Basic Pension System", which adjusted the individual account of the 1997 basic pension system. In this essay we analyze that adjustment and use life annuity actuarial theory to establish a model of the basic pension substitution rate. Monte Carlo simulation is then used to demonstrate the soundness of the model. Some suggestions concerning the substitution rate under the current policy are put forward.
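A hedged sketch of the kind of individual-account calculation involved: accumulate contributions under stochastic wage growth and account returns, convert the final balance to a monthly pension with an annuity divisor, and read off the distribution of the substitution rate. The 8% contribution rate, the growth and return parameters, and the 139-month divisor for retirement at 60 are illustrative placeholders, not the essay's calibration.

```python
import numpy as np

def simulate_substitution_rate(n_paths=10_000, years=35, contrib_rate=0.08,
                               wage_growth=(0.06, 0.02), acct_return=(0.04, 0.03),
                               divisor_months=139, seed=0):
    """Monte Carlo distribution of the individual-account substitution rate
    (annual pension in the first retirement year over the final annual wage)."""
    rng = np.random.default_rng(seed)
    wage = np.ones(n_paths)            # wage in the first working year (normalised)
    balance = np.zeros(n_paths)        # individual-account balance
    for _ in range(years):
        balance = balance * (1 + rng.normal(*acct_return, n_paths)) + contrib_rate * wage
        wage *= 1 + rng.normal(*wage_growth, n_paths)
    monthly_pension = balance / divisor_months
    return 12 * monthly_pension / wage

rates = simulate_substitution_rate()
print(f"mean substitution rate: {rates.mean():.1%}, 5th percentile: {np.percentile(rates, 5):.1%}")
```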
Abstract:
In this work, we analyze the use of Credit Suisse's CreditRisk+ methodology and its suitability for the Brazilian market, with the aim of calculating the risk of a credit portfolio. Certain assumptions made in the formulation of the CreditRisk+ model do not hold for the Brazilian market, which is characterized, for example, by a high probability of default. We therefore develop a methodology for computing the loss distribution by Monte Carlo simulation, modifying some of the model's original assumptions in order to adapt it to our market. The use of simulation also yields more accurate results when portfolios contain a small population of contracts, and it eliminates possible convergence problems of the analytical method, even under the assumptions of the original model. We also find that the computational time can be lower than that of the original methodology, particularly for portfolios with a large number of obligors with distinct profiles allocated across several sectors of the economy. Given the restrictions above, we believe the proposed methodology is an alternative to the analytical form of the CreditRisk+ model. We present examples of the use of, and results provided by, these simulations. The central point of this work is to highlight the importance of using alternative credit risk measurement methodologies that incorporate the particularities of the Brazilian market.
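A minimal sketch of this simulation approach, assuming a CreditRisk+-style construction in which Gamma-distributed sector factors scale the default probabilities, while Bernoulli defaults replace the Poisson approximation (one of the assumptions a simulation can relax for high-default portfolios). The portfolio, sector volatility, and function names are illustrative, not the work's actual model.

```python
import numpy as np

def mc_loss_distribution(exposures, pds, sectors, n_sims=100_000, sector_sigma=0.75, seed=0):
    """Monte Carlo loss distribution for a credit portfolio: Gamma sector factors
    (mean 1, std sector_sigma) scale the default probabilities, defaults are Bernoulli."""
    rng = np.random.default_rng(seed)
    n_sectors = sectors.max() + 1
    shape = 1.0 / sector_sigma**2                        # Gamma(shape, 1/shape) has mean 1
    losses = np.empty(n_sims)
    for s in range(n_sims):
        factors = rng.gamma(shape, 1.0 / shape, n_sectors)
        p = np.clip(pds * factors[sectors], 0.0, 1.0)    # conditional default probabilities
        defaults = rng.random(len(pds)) < p
        losses[s] = exposures[defaults].sum()
    return losses

# Illustrative portfolio: 500 obligors with high default probabilities, two sectors.
rng = np.random.default_rng(1)
exposures = rng.lognormal(mean=10, sigma=1, size=500)
pds = rng.uniform(0.01, 0.10, size=500)
sectors = rng.integers(0, 2, size=500)

losses = mc_loss_distribution(exposures, pds, sectors)
print("expected loss:", losses.mean(), " 99% credit VaR:", np.quantile(losses, 0.99))
```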
Abstract:
This work aims to describe, evaluate, and compare the analytical and Monte Carlo simulation methodologies for calculating the Value at Risk (VaR) of financial institutions and companies. To compare the advantages and disadvantages of each methodology, we perform algebraic comparisons and carry out several empirical tests with hypothetical institutions presenting different levels of leverage and balance-sheet composition and operating in different markets (we consider the markets for stocks, call options, and fixed-rate fixed-income securities).
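A minimal sketch of the two approaches being compared, for a single linear stock position where the parametric (delta-normal) and Monte Carlo answers should nearly coincide; the option and fixed-rate bond positions, where the methodologies actually diverge, are beyond this sketch, and all parameter values are illustrative.

```python
from statistics import NormalDist
import numpy as np

def parametric_var(value, mu, sigma, alpha=0.99):
    """Delta-normal one-period VaR for a single linear (stock) position."""
    z = NormalDist().inv_cdf(1 - alpha)
    return -value * (mu + sigma * z)

def monte_carlo_var(value, mu, sigma, alpha=0.99, n_sims=200_000, seed=0):
    """Full-revaluation Monte Carlo VaR under the same normal-return assumption."""
    rng = np.random.default_rng(seed)
    pnl = value * rng.normal(mu, sigma, n_sims)
    return -np.quantile(pnl, 1 - alpha)

value, mu, sigma = 1_000_000.0, 0.0, 0.02      # illustrative position size and daily return moments
print("parametric 99% VaR :", parametric_var(value, mu, sigma))
print("Monte Carlo 99% VaR:", monte_carlo_var(value, mu, sigma))
```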
Abstract:
Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains in forecasting accuracy and estimation uncertainty from two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration, and the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length; criteria selecting lag and rank simultaneously perform better in this case. Second, this translates into a superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even for the long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
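A minimal sketch of the first exercise listed above: a Monte Carlo assessment of how often an information criterion selects the true lag length of a bivariate VAR, using equation-by-equation OLS and BIC. The restricted criteria, the cointegration/SCCF structure, and the proposed estimation algorithm are not reproduced here; the data-generating process and replication counts are illustrative.

```python
import numpy as np

def simulate_var1(A, n_obs, rng):
    """Simulate a bivariate VAR(1): y_t = A y_{t-1} + e_t, with e_t ~ N(0, I)."""
    y = np.zeros((n_obs, 2))
    for t in range(1, n_obs):
        y[t] = A @ y[t - 1] + rng.standard_normal(2)
    return y

def bic_for_lag(y, p):
    """Fit a VAR(p) with an intercept by OLS and return the Gaussian BIC."""
    T, k = y.shape
    Y = y[p:]
    X = np.hstack([np.ones((T - p, 1))] + [y[p - i:T - i] for i in range(1, p + 1)])
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ B
    sigma = resid.T @ resid / (T - p)
    return np.log(np.linalg.det(sigma)) + np.log(T - p) * (k * X.shape[1]) / (T - p)

rng = np.random.default_rng(0)
A = np.array([[0.5, 0.1], [0.0, 0.4]])          # stationary true VAR(1)
hits = 0
for _ in range(500):                            # Monte Carlo replications
    y = simulate_var1(A, n_obs=100, rng=rng)
    hits += min(range(1, 5), key=lambda p: bic_for_lag(y, p)) == 1
print("fraction of replications in which BIC selects the true lag:", hits / 500)
```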
Abstract:
In financial terms, credit activity consists of delivering a present value against a promise of payment at a future date. Since it is a promise of payment, there is a possibility that it will not be honored, which gives rise to credit risk. Given the existence of this risk, it falls to the institutions that grant credit to analyze the applicant, seeking to assess whether it is able to honor the commitment made. One of the aspects to be considered in this analysis concerns the company's payment capacity, assessed through projections of its cash flow. On this basis, the present work aimed to build a simulation model for evaluating the payment capacity of companies in long-term financing. The application of a model of this nature can provide relevant additional information for the problem under analysis. The evaluation of the resulting model, carried out by consulting experts in the credit decision problem, indicates that both the technique used and the model built are adequate and consistent with reality, providing greater confidence for decision making.
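A hedged sketch of the sort of simulation model described: project operating cash flow under uncertain revenue growth and margins and estimate the probability that it covers a fixed long-term debt service in every year of the financing. All figures, distributions, and names are illustrative placeholders, not the work's model.

```python
import numpy as np

def payment_capacity(n_paths=20_000, years=10, revenue0=100.0,
                     growth=(0.05, 0.10), margin=(0.20, 0.05),
                     annual_debt_service=12.0, seed=0):
    """Monte Carlo estimate of the probability that simulated operating cash flow
    covers a fixed annual debt service in every year of the financing term."""
    rng = np.random.default_rng(seed)
    revenue = np.full(n_paths, revenue0)
    covered = np.ones(n_paths, dtype=bool)
    for _ in range(years):
        revenue *= 1 + rng.normal(*growth, n_paths)                       # stochastic revenue growth
        cash_flow = revenue * np.clip(rng.normal(*margin, n_paths), 0.0, None)  # stochastic margin
        covered &= cash_flow >= annual_debt_service
    return covered.mean()

print(f"probability of full payment capacity: {payment_capacity():.1%}")
```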