927 results for Markov Model with Monte-Carlo microsimulations
Abstract:
A numerical study is presented of the three-dimensional Gaussian random-field Ising model at T=0 driven by an external field. Standard synchronous relaxation dynamics is employed to obtain the magnetization versus field hysteresis loops. The focus is on the analysis of the number and size distribution of the magnetization avalanches. They are classified as being nonspanning, one-dimensional-spanning, two-dimensional-spanning, or three-dimensional-spanning depending on whether or not they span the whole lattice in the different space directions. Moreover, finite-size scaling analysis enables identification of two different types of nonspanning avalanches (critical and noncritical) and two different types of three-dimensional-spanning avalanches (critical and subcritical), whose numbers increase with the system size L as a power law with different exponents. We conclude by giving a scenario for avalanche behavior in the thermodynamic limit.
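A minimal Python sketch of this type of simulation is given below. The lattice size, disorder width, and field step are illustrative, and all flips occurring within one field increment are merged into a single avalanche; this is a simplification of the adiabatic protocol used in the paper, not its exact procedure.

```python
import numpy as np

def rfim_hysteresis(L=16, sigma=2.0, dH=0.05, seed=0):
    """Sketch of the ascending branch of the hysteresis loop of the 3D
    zero-temperature Gaussian random-field Ising model, relaxed with
    synchronous updates. Returns (fields, magnetizations, avalanche sizes)."""
    rng = np.random.default_rng(seed)
    h = rng.normal(0.0, sigma, size=(L, L, L))   # quenched Gaussian random fields
    s = -np.ones((L, L, L))                      # start fully magnetized down

    def local_field(H):
        # nearest-neighbor sum with periodic boundaries, coupling J = 1
        nn = sum(np.roll(s, d, axis=a) for a in range(3) for d in (1, -1))
        return nn + h + H

    fields, mags, avalanches = [], [], []
    for H in np.arange(-6.0, 6.0 + dH, dH):
        flipped = 0
        while True:                              # synchronous relaxation sweeps
            unstable = (s * local_field(H)) < 0
            if not unstable.any():
                break
            s[unstable] *= -1
            flipped += int(unstable.sum())
        if flipped:
            avalanches.append(flipped)           # all flips in this field step
        fields.append(H)
        mags.append(s.mean())
    return np.array(fields), np.array(mags), avalanches

F, M, A = rfim_hysteresis()
print("largest avalanche:", max(A), "spins")
```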
Abstract:
Spanning avalanches in the 3D Gaussian Random Field Ising Model (3D-GRFIM) with metastable dynamics at T=0 have been studied. Statistical analysis of the field values at which avalanches occur has enabled a Finite-Size Scaling (FSS) study of the avalanche density to be performed. Furthermore, a direct measurement of the geometrical properties of the avalanches has confirmed an earlier hypothesis that several types of spanning avalanches with two different fractal dimensions coexist at the critical point. We finally compare the phase diagram of the 3D-GRFIM with metastable dynamics to that of the same model in equilibrium at T=0.
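As a generic illustration of how a fractal dimension of an avalanche cluster can be estimated, the sketch below applies plain box counting to a 3D boolean mask of occupied sites; it is not the measurement protocol of the paper, and the toy cluster is assumed for demonstration only.

```python
import numpy as np

def box_counting_dimension(mask, box_sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of the set of sites marked in `mask`
    (a 3D boolean array) as the slope of log N(b) versus log(1/b)."""
    L = mask.shape[0]
    counts = []
    for b in box_sizes:
        n = 0
        for x in range(0, L, b):
            for y in range(0, L, b):
                for z in range(0, L, b):
                    if mask[x:x + b, y:y + b, z:z + b].any():
                        n += 1                     # box contains part of the cluster
        counts.append(n)
    slope, _ = np.polyfit(-np.log(box_sizes), np.log(counts), 1)
    return slope

# Toy example: a planar cluster embedded in a 32^3 lattice (dimension ~ 2)
mask = np.zeros((32, 32, 32), dtype=bool)
mask[:, :, 16] = True
print("estimated fractal dimension:", round(box_counting_dimension(mask), 2))
```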
Abstract:
We study the nonequilibrium behavior of the three-dimensional Gaussian random-field Ising model at T=0 in the presence of a uniform external field using a two-spin-flip dynamics. The deterministic, history-dependent evolution of the system is compared with the one obtained with the standard one-spin-flip dynamics used in previous studies of the model. The change in the dynamics yields a significant suppression of coercivity, but the distribution of avalanches (in number and size) stays remarkably similar, except for the largest ones that are responsible for the jump in the saturation magnetization curve at low disorder in the thermodynamic limit. By performing a finite-size scaling study, we find strong evidence that the change in the dynamics does not modify the universality class of the disorder-induced phase transition.
Abstract:
A model for the study of hysteresis and avalanches in a first-order phase transition from a single variant phase to a multivariant phase is presented. The model is based on a modification of the random-field Potts model with metastable dynamics by adding a dipolar interaction term truncated at nearest neighbors. We focus our study on hysteresis loop properties, on the three-dimensional microstructure formation, and on avalanche statistics.
Abstract:
While channel coding is a standard method of improving a system's energy efficiency in digital communications, its practice does not extend to high-speed links. Increasing demands on network speeds are placing a large burden on the energy efficiency of high-speed links and make the benefit of channel coding for these systems a timely subject. The low error rates of interest and the presence of residual intersymbol interference (ISI) caused by hardware constraints impede the analysis and simulation of coded high-speed links. Focusing on the residual ISI and combined noise as the dominant error mechanisms, this paper analyses error correlation through the concepts of error region, channel signature, and correlation distance. This framework provides a deeper insight into joint error behaviours in high-speed links, extends the range of statistical simulation for coded high-speed links, and provides a case against the use of biased Monte Carlo methods in this setting.
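A toy sketch of the error-correlation idea is shown below: bits are sent through a channel with residual ISI plus Gaussian noise, and the correlation between error events is estimated as a function of bit separation. The channel taps, noise level, and threshold detector are assumptions for illustration, not the paper's link model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bits = 200_000
bits = rng.choice([-1.0, 1.0], size=n_bits)
taps = np.array([1.0, 0.45, -0.25])            # main cursor + residual ISI (assumed)
rx = np.convolve(bits, taps)[:n_bits]           # received samples with causal ISI
rx += rng.normal(0.0, 0.55, size=n_bits)        # additive Gaussian noise
errors = (np.sign(rx) != bits).astype(float)    # error indicator per bit

p = errors.mean()
print(f"raw error rate: {p:.4f}")
for lag in range(1, 6):
    joint = np.mean(errors[:-lag] * errors[lag:])   # P(error at i and at i+lag)
    corr = (joint - p * p) / (p * (1 - p))          # normalized error correlation
    print(f"lag {lag}: error correlation {corr:.3f}")
```

Errors separated by more than the channel memory become essentially uncorrelated, which is the intuition behind a correlation distance.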
Abstract:
Caches are known to consume up to half of all system power in embedded processors. Co-optimizing the performance and power of the cache subsystems is therefore an important step in the design of embedded systems, especially those employing application-specific instruction processors. In this project, we propose an analytical cache model that succinctly captures the miss performance of an application over the entire cache parameter space. Unlike exhaustive trace-driven simulation, our model requires that the program be simulated only once so that a few key characteristics can be obtained. Using these application-dependent characteristics, the model can span the entire cache parameter space consisting of cache sizes, associativity and cache block sizes. In our unified model, we are able to cater for direct-mapped, set-associative and fully associative instruction, data and unified caches. Validation against full trace-driven simulations shows that our model has a high degree of fidelity. Finally, we show how the model can be coupled with a power model for caches such that one can very quickly decide on Pareto-optimal performance-power design points for rapid design space exploration.
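The final step, combining a miss-rate model with a power model to pick Pareto-optimal configurations, can be sketched as below. The `miss_rate` and `power_mw` functions are arbitrary placeholders standing in for the paper's analytical model, not its actual equations.

```python
from itertools import product

# Placeholder stand-ins for the analytical miss-rate and power models.
def miss_rate(size_kb, assoc, block_b):
    return 0.05 * (32.0 / size_kb) ** 0.5 * (1.0 + 0.3 / assoc) * (64.0 / block_b) ** 0.2

def power_mw(size_kb, assoc, block_b):
    return 0.8 * size_kb + 1.5 * assoc + 0.02 * block_b

configs = [((s, a, b), miss_rate(s, a, b), power_mw(s, a, b))
           for s, a, b in product([8, 16, 32, 64], [1, 2, 4, 8], [16, 32, 64])]

# Keep only Pareto-optimal points: no other configuration is at least as good
# in both miss rate and power while differing in at least one of them.
pareto = [c for c in configs
          if not any(o[1] <= c[1] and o[2] <= c[2] and (o[1], o[2]) != (c[1], c[2])
                     for o in configs)]
for (size, assoc, block), m, p in sorted(pareto, key=lambda c: c[2]):
    print(f"size={size}KB assoc={assoc} block={block}B  miss={m:.4f}  power={p:.1f}mW")
```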
Abstract:
The Hardy-Weinberg law, formulated about 100 years ago, states that under certain assumptions the three genotypes AA, AB and BB at a bi-allelic locus are expected to occur in the proportions p², 2pq, and q², respectively, where p is the allele frequency of A, and q = 1-p. There are many statistical tests used to check whether empirical marker data obey the Hardy-Weinberg principle. Among these are the classical chi-square test (with or without continuity correction), the likelihood ratio test, Fisher's exact test, and exact tests in combination with Monte Carlo and Markov chain algorithms. Tests for Hardy-Weinberg equilibrium (HWE) are numerical in nature, requiring the computation of a test statistic and a p-value. There is, however, ample room for the use of graphics in HWE tests, in particular for the ternary plot. Nowadays, many genetic studies use genetic markers known as Single Nucleotide Polymorphisms (SNPs). SNP data come in the form of counts, but from the counts one typically computes genotype frequencies and allele frequencies. These frequencies satisfy the unit-sum constraint, and their analysis therefore falls within the realm of compositional data analysis (Aitchison, 1986). SNPs are usually bi-allelic, which implies that the genotype frequencies can be adequately represented in a ternary plot. Compositions that are in exact HWE describe a parabola in the ternary plot. Compositions for which HWE cannot be rejected in a statistical test are typically "close" to the parabola, whereas compositions that differ significantly from HWE are "far". By rewriting the statistics used to test for HWE in terms of heterozygote frequencies, acceptance regions for HWE can be obtained that can be depicted in the ternary plot. This way, compositions can be tested for HWE purely on the basis of their position in the ternary plot (Graffelman & Morales, 2008). This leads to clear graphical representations in which large numbers of SNPs can be tested for HWE in a single graph. Several examples of graphical tests for HWE (implemented in R software) will be shown, using SNP data from different human populations.
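The classical chi-square test mentioned above is easy to reproduce from genotype counts; a minimal sketch (without continuity correction) follows, with made-up counts used purely for illustration.

```python
from math import erfc, sqrt

def hwe_chi_square(n_AA, n_AB, n_BB):
    """Classical chi-square test for Hardy-Weinberg equilibrium at a
    bi-allelic locus from observed genotype counts (no continuity correction)."""
    n = n_AA + n_AB + n_BB
    p = (2 * n_AA + n_AB) / (2 * n)          # allele frequency of A
    q = 1.0 - p
    expected = (n * p * p, 2 * n * p * q, n * q * q)   # p^2, 2pq, q^2 proportions
    observed = (n_AA, n_AB, n_BB)
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    p_value = erfc(sqrt(chi2 / 2.0))          # upper tail of chi-square with 1 d.f.
    return chi2, p_value

# Illustrative genotype counts for a single SNP (made-up numbers)
chi2, pval = hwe_chi_square(n_AA=298, n_AB=489, n_BB=213)
print(f"chi-square = {chi2:.3f}, p-value = {pval:.3f}")
```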
Abstract:
In this chapter, an asymmetric DSGE model is built in order to account for asymmetries in business cycles. One of the most important contributions of this work is the construction of a general utility function which nests loss aversion, risk aversion and habit formation by means of a smooth transition function. The main idea behind this asymmetric utility function is that under recession the agents over-smooth consumption and leisure choices in order to prevent large deviations from the reference level of utility, while under boom the agents simply smooth consumption and leisure while trying to stay as far as possible from the reference level of utility. Simulations of this model by means of the perturbation method show that it is possible to reproduce asymmetrical business cycles in which recessions are stronger on impact than booms and booms are more long-lasting than recessions. One additional and unexpected result is the downward stickiness displayed by real wages. As a consequence, there is a more persistent fall in employment in recessions than in booms. Thus, the model reproduces not only asymmetrical business cycles but also real stickiness and hysteresis.
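Below is an illustrative sketch of a smooth transition weight blending two preference regimes around a reference level. The logistic functional form, the parameter gamma, and both regime branches are assumptions for demonstration only, not the chapter's actual utility specification.

```python
import math

def transition_weight(x, reference, gamma=5.0):
    """Logistic weight: close to 0 well below the reference (recession regime)
    and close to 1 well above it (boom regime); gamma controls smoothness."""
    return 1.0 / (1.0 + math.exp(-gamma * (x - reference)))

def utility(consumption, reference, gamma=5.0):
    """Blend a recession-regime and a boom-regime felicity function with the
    smooth transition weight (both branches are placeholders)."""
    w = transition_weight(consumption, reference, gamma)
    u_recession = -2.0 * (reference - consumption) ** 2   # loss-averse branch
    u_boom = math.log(max(consumption, 1e-9))             # standard smoothing branch
    return (1.0 - w) * u_recession + w * u_boom

for c in (0.8, 1.0, 1.2):
    print(f"c={c:.1f}  u={utility(c, reference=1.0):.3f}")
```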
Abstract:
The aims are to analyze systematic procedures for the synthesis of results; to offer methodological alternatives to the problems detected in the process of carrying out a meta-analysis; and to establish a set of systematic guidelines for conducting reviews of research results. The first part presents the conceptualization of meta-analysis as a perspective for integrating results. The methodological alternatives for meta-analytic integration are then described and analyzed. Finally, the performance of the methodological proposals is evaluated, determining their suitability to the common features of conducting a meta-analytic study. The analytic-descriptive method and Monte Carlo simulation are used, which makes it possible to compare alternatives according to objective criteria. The idea is to generate data sets that follow predetermined models; the technique under study is then applied to the generated data and its behavior is checked under the different experimental conditions. The superiority of hierarchical linear models for the quantitative synthesis of evidence in the Social Sciences is shown, since their estimators are scarcely biased, highly efficient and robust, and their hypothesis tests show power above nominal levels. The synthesis of results responds to the need to rationalize the accumulation of knowledge produced by scientific progress. Among the alternatives, meta-analysis is the most suitable tool for quantitative synthesis. It is a type of research focused on analyzing the generalization of results from primary studies, making it possible to establish the state of research in a specific field and to build relational models. Its main problems are methodological and procedural. The adaptation of traditional statistical methods of analysis of variance and regression is a major step forward, but these are not entirely suited to meta-analysis. Therefore, the integration procedures proposed from hierarchical linear models are a valid, simple and effective alternative to the traditional meta-analytic procedures for integrating results.
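A minimal Monte Carlo comparison of integration methods can be sketched as follows. It contrasts a fixed-effect estimator with a DerSimonian-Laird random-effects estimator as a simple stand-in for the hierarchical-model approach; the true effect, number of studies, variances, and number of replications are arbitrary simulation settings, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(42)
true_effect, tau2, k, reps = 0.3, 0.05, 20, 2000   # arbitrary simulation settings

fe_estimates, re_estimates = [], []
for _ in range(reps):
    v = rng.uniform(0.01, 0.1, size=k)                  # within-study variances
    theta = rng.normal(true_effect, np.sqrt(tau2), k)   # true study-level effects
    y = rng.normal(theta, np.sqrt(v))                   # observed effect sizes

    w = 1.0 / v                                         # fixed-effect weights
    fe = np.sum(w * y) / np.sum(w)
    # DerSimonian-Laird estimate of the between-study variance
    Q = np.sum(w * (y - fe) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2_hat = max(0.0, (Q - (k - 1)) / c)
    w_re = 1.0 / (v + tau2_hat)                         # random-effects weights
    re = np.sum(w_re * y) / np.sum(w_re)

    fe_estimates.append(fe)
    re_estimates.append(re)

for name, est in (("fixed-effect", fe_estimates), ("random-effects", re_estimates)):
    est = np.array(est)
    print(f"{name:15s} bias={est.mean() - true_effect:+.4f}  sd={est.std():.4f}")
```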
Abstract:
Realistic rendering of animations is known to be an expensive processing task when physically-based global illumination methods are used in order to improve illumination details. This paper presents an acceleration technique to compute animations in radiosity environments. The technique is based on an interpolation approach that exploits temporal coherence in radiosity. A fast global Monte Carlo pre-processing step over the whole animated sequence is introduced to select important frames. These are fully computed and used as a basis for interpolating the rest of the sequence. The approach is completely view-independent. Once the illumination is computed, it can be visualized by any animated camera. Results show significant speed-ups, indicating that the technique could be an interesting alternative to deterministic methods for computing non-interactive radiosity animations for moderately complex scenarios.
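The keyframe-selection-plus-interpolation idea can be sketched as below. The functions `coarse_estimate` and `full_radiosity_solve`, the change threshold, and the per-patch linear interpolation are all placeholders for illustration, not the paper's actual pre-processing pass or radiosity solver.

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_patches = 60, 500

def coarse_estimate(frame):      # stand-in for the cheap global Monte Carlo pass
    return np.sin(frame / 10.0) + 0.05 * rng.normal(size=n_patches)

def full_radiosity_solve(frame): # stand-in for the expensive full radiosity solution
    return np.sin(frame / 10.0) * np.ones(n_patches)

# 1) coarse pass over all frames, 2) pick keyframes where the estimate changed
# noticeably, 3) fully solve keyframes only, 4) interpolate radiosity in between.
coarse = np.array([coarse_estimate(f) for f in range(n_frames)])
keyframes = [0]
for f in range(1, n_frames):
    if np.abs(coarse[f] - coarse[keyframes[-1]]).mean() > 0.2:
        keyframes.append(f)
if keyframes[-1] != n_frames - 1:
    keyframes.append(n_frames - 1)

solutions = {f: full_radiosity_solve(f) for f in keyframes}
frames = np.empty((n_frames, n_patches))
for a, b in zip(keyframes[:-1], keyframes[1:]):
    for f in range(a, b + 1):
        t = (f - a) / (b - a)
        frames[f] = (1 - t) * solutions[a] + t * solutions[b]

print(f"fully computed {len(keyframes)} of {n_frames} frames")
```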
Abstract:
Diffusion tensor magnetic resonance imaging, which measures directional information of water diffusion in the brain, has emerged as a powerful tool for human brain studies. In this paper, we introduce a new Monte Carlo-based fiber tracking approach to estimate brain connectivity. One of the main characteristics of this approach is that all parameters of the algorithm are automatically determined at each point using the entropy of the eigenvalues of the diffusion tensor. Experimental results show the good performance of the proposed approach.
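A small sketch of the eigenvalue-entropy idea follows: the Shannon entropy of the normalized eigenvalues of the diffusion tensor distinguishes fiber-like (anisotropic) from isotropic voxels. The example tensors and the mapping from entropy to a sample count are illustrative assumptions, not the paper's parameter rules.

```python
import numpy as np

def tensor_entropy(D):
    """Normalized Shannon entropy of the eigenvalues of a 3x3 diffusion tensor:
    near 0 for a strongly single-direction tensor, 1 for an isotropic one."""
    lam = np.clip(np.linalg.eigvalsh(D), 1e-12, None)
    p = lam / lam.sum()
    return float(-(p * np.log(p)).sum() / np.log(3.0))

# Illustrative tensors (units arbitrary): fiber-like vs. isotropic diffusion
D_fiber = np.diag([1.7e-3, 0.2e-3, 0.2e-3])
D_iso = np.diag([0.7e-3, 0.7e-3, 0.7e-3])
for name, D in (("fiber-like", D_fiber), ("isotropic", D_iso)):
    H = tensor_entropy(D)
    n_samples = int(10 + 90 * H)   # example: draw more Monte Carlo samples where uncertain
    print(f"{name}: entropy={H:.2f}, samples={n_samples}")
```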