975 results for markov chains monte carlo methods
Abstract:
A Monte Carlo simulation study of vacancy-assisted domain growth in asymmetric binary alloys is presented. The system is modeled using a three-state ABV Hamiltonian which includes an asymmetry term. Our simulated system is a stoichiometric two-dimensional binary alloy with a single vacancy which evolves according to the vacancy-atom exchange mechanism. We find that, compared to the symmetric case, the ordering process slows down dramatically. The asymptotic behavior is algebraic and characterized by the Allen-Cahn growth exponent x = 1/2. The late stages of the evolution are preceded by a transient regime strongly affected by both the temperature and the degree of asymmetry of the alloy. The results are discussed and compared to those obtained for the symmetric case.
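A minimal sketch of the kind of vacancy-atom exchange move described in this abstract, under stated assumptions: the toy nearest-neighbour couplings J_AB, J_AV, J_BV, the lattice size, and the temperature are illustrative choices, not the paper's Hamiltonian or parameters.

```python
# Hedged sketch (not the authors' code): a Metropolis vacancy-atom exchange
# step on a 2D lattice with a toy ABV-style nearest-neighbour energy.
# J_AV != J_BV plays the role of an asymmetry term; all values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
L = 32                               # linear lattice size (assumed)
J_AB, J_AV, J_BV = -1.0, 0.5, 0.25   # toy couplings; J_AB < 0 favors ordering
T = 0.5                              # temperature in units of |J_AB|/k_B (assumed)

# States: +1 = A atom, -1 = B atom, 0 = the single vacancy.
lattice = rng.choice([1, -1], size=(L, L))   # roughly stoichiometric mixture
vac = (rng.integers(L), rng.integers(L))
lattice[vac] = 0

def pair_energy(a, b):
    """Bond energy of a nearest-neighbour pair in the toy ABV model."""
    if a == 0 or b == 0:
        return J_AV if (a == 1 or b == 1) else J_BV
    return J_AB if a != b else 0.0

def site_energy(site):
    i, j = site
    return sum(pair_energy(lattice[i, j], lattice[(i + di) % L, (j + dj) % L])
               for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))

def vacancy_exchange_step(vac):
    """Propose swapping the vacancy with a random neighbour (Metropolis)."""
    i, j = vac
    di, dj = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(4)]
    nb = ((i + di) % L, (j + dj) % L)
    e_old = site_energy(vac) + site_energy(nb)
    lattice[vac], lattice[nb] = lattice[nb], lattice[vac]    # trial swap
    de = site_energy(vac) + site_energy(nb) - e_old
    if de <= 0 or rng.random() < np.exp(-de / T):
        return nb                                            # accept: vacancy moved
    lattice[vac], lattice[nb] = lattice[nb], lattice[vac]    # reject: undo
    return vac

for _ in range(10000):
    vac = vacancy_exchange_step(vac)
```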
Abstract:
ic first-order transition line ending in a critical point. This critical point is responsible for the existence of large premartensitic fluctuations, which manifest as broad peaks in the specific heat that are not always associated with a true phase transition. The main conclusion is that premartensitic effects result from the interplay between the softness of the anomalous phonon driving the modulation and the magnetoelastic coupling. In particular, the premartensitic transition occurs when this coupling is strong enough to freeze the phonon mode involved. The implications of the results in relation to the available experimental data are discussed.
Abstract:
We study the analytical solution of the Monte Carlo dynamics in the spherical Sherrington-Kirkpatrick model using the generating-function technique. Explicit solutions for one-time observables (such as the energy) and two-time observables (such as the correlation and response functions) are obtained. We show that the crucial quantity governing the dynamics is the acceptance rate. At zero temperature, an adiabatic approximation reveals that the relaxational behavior of the model corresponds to that of a single harmonic oscillator with an effective renormalized mass.
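A minimal numerical counterpart to the dynamics analyzed above, assuming standard SK-type Gaussian couplings and an illustrative move size and temperature; it simply runs Metropolis Monte Carlo on a spherically constrained configuration and tracks the acceptance rate that the abstract identifies as the governing quantity.

```python
# Hedged sketch (not the authors' analytical solution): Metropolis dynamics
# for a spherical Sherrington-Kirkpatrick-like model with acceptance-rate tracking.
import numpy as np

rng = np.random.default_rng(1)
N, T, delta = 256, 0.2, 0.1                     # size, temperature, move size (assumed)

# Symmetric Gaussian couplings with SK-type 1/N variance scaling.
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
J = (J + J.T) / 2.0
np.fill_diagonal(J, 0.0)

s = rng.normal(size=N)
s *= np.sqrt(N) / np.linalg.norm(s)             # spherical constraint: sum s_i^2 = N

def energy(spins):
    return -0.5 * spins @ J @ spins

E, accepted, steps = energy(s), 0, 5000
for _ in range(steps):
    trial = s + delta * rng.normal(size=N)          # small random move
    trial *= np.sqrt(N) / np.linalg.norm(trial)     # project back onto the sphere
    dE = energy(trial) - E
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        s, E = trial, E + dE
        accepted += 1

print("acceptance rate:", accepted / steps, "energy per spin:", E / N)
```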
Abstract:
Sensing with electromagnetic waves having frequencies in the Terahertz-range is a very attractive investigative method with applications in fundamental research and industrial settings. Up to now, a lot of sources and detectors are available. However, most of these systems are bulky and have to be used in controllable environments such as laboratories. In 1993 Dyakonov and Shur suggested that plasma waves developing in field-effect-transistors can be used to emit and detect THz-radiation. Later on, it was shown that these plasma waves lead to rectification and allows for building efficient detectors. In contrast to the prediction that these plasma waves lead to new promising solid-state sources, only a few weak sources are known up to now. This work studies THz plasma waves in semiconductor devices using the Monte Carlo method in order to resolve this issue. A fast Monte Carlo solver was developed implementing a nonparabolic bandstructure representation of the used semiconductors. By investigating simplified field-effect-transistors it was found that the plasma frequency follows under equilibrium conditions the analytical predictions. However, no current oscillations were found at room temperature or with a current flowing in the channel. For more complex structures, consisting of ungated and gated regions, it was found that the plasma frequency does not follow the value predicted by the dispersion relation of the gated nor the ungated device.
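For reference, the "analytical predictions" mentioned above are usually the textbook small-signal plasmon dispersions of a gated and an ungated two-dimensional channel, omega = k*sqrt(e^2*n*d/(m*eps)) and omega = sqrt(e^2*n*k/(2*m*eps)) respectively. The sketch below just evaluates these two relations; the GaAs-like material parameters, sheet density, gate distance, and wavevector range are illustrative assumptions, not values from this work.

```python
# Hedged sketch: textbook gated vs. ungated 2D plasmon dispersions, often used
# as the analytical reference in such studies. All parameters are illustrative.
import numpy as np

e   = 1.602e-19           # elementary charge [C]
eps = 12.9 * 8.854e-12    # GaAs-like permittivity [F/m] (assumed)
m   = 0.063 * 9.109e-31   # effective mass [kg] (assumed)
n   = 1e16                # sheet electron density [1/m^2] (assumed)
d   = 30e-9               # gate-to-channel distance [m] (assumed)

def omega_gated(k):
    """Gated 2D plasmon: linear, sound-like dispersion."""
    return k * np.sqrt(e**2 * n * d / (m * eps))

def omega_ungated(k):
    """Ungated 2D plasmon: square-root dispersion."""
    return np.sqrt(e**2 * n * k / (2 * m * eps))

k = np.linspace(1e6, 1e8, 5)     # wavevectors set by the device length (assumed)
print("f_gated   [THz]:", omega_gated(k) / (2 * np.pi * 1e12))
print("f_ungated [THz]:", omega_ungated(k) / (2 * np.pi * 1e12))
```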
Abstract:
The research project starts from the dynamics of the outsourced distribution model of a mass-consumption company in Colombia, specialized in dairy products, referred to in this study as "Lactosa". Using panel data within a case study, two demand models are built by product category and distributor, and stochastic simulation is used to identify the relevant variables affecting their cost structures. The problem is modeled from the income statement of each of the four distributors analyzed in the central region of the country. The cost structure and sales behavior are analyzed for a given logistics distribution margin (%), as a function of the relevant independent variables related to the business, the market, and the macroeconomic environment described in the object of study. Among other findings, notable gaps stand out in distribution costs and sales-force costs, despite the homogeneity of the segments. The study identifies value drivers and the costs with the greatest individual dispersion, and suggests strategic alliances among some groups of distributors. The panel-data modeling identifies the relevant management variables that affect sales volume by category and distributor, which focuses management efforts. It is recommended to reduce these gaps and to promote, from the producer's side, strategies focused on standardizing the distributors' internal processes, and to promote and replicate the analysis models without attempting to replace expert knowledge. Scenario building jointly and securely strengthens the competitive position of the company and its distributors.
Abstract:
The aims are to analyze systematic procedures for the synthesis of results, to offer methodological alternatives to the problems detected in the process of conducting a meta-analysis, and to establish a set of systematic guidelines for carrying out reviews of research results. The first part presents the conceptualization of meta-analysis as a perspective for the integration of results. The methodological alternatives for meta-analytic integration are then described and analyzed. Finally, the performance of the methodological proposals is evaluated, determining their suitability to the common characteristics of a meta-analytic study. The analytic-descriptive method and Monte Carlo simulation are used, which allows the alternatives to be compared according to objective criteria: data sets are generated from predetermined models, the technique under study is applied to the generated data, and its behavior is checked under the different experimental conditions. The results show the superiority of hierarchical linear models for the quantitative synthesis of evidence in the Social Sciences, since their estimators are only slightly biased, highly efficient and robust, and their hypothesis tests show power above the nominal levels. The synthesis of results responds to the need to rationalize the accumulation of knowledge produced by scientific progress. Among the alternatives, meta-analysis is the most suitable tool for quantitative synthesis. It is a type of research focused on analyzing the generalization of results from primary studies, making it possible to establish the state of research in a specific field and to build relational models. Its main problems are methodological and procedural. The adaptation of traditional statistical methods of analysis of variance and regression is a great advance, but they are not entirely suited to meta-analysis. Therefore, the integration procedures proposed from hierarchical linear models are a valid, simple, and effective alternative to traditional meta-analytic procedures for the integration of results.
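A minimal sketch of the kind of Monte Carlo study described above, under stated assumptions: data sets are generated from a known random-effects (hierarchical) model, two common estimators of the overall effect are applied, and bias and confidence-interval coverage are recorded. The estimators, true effect, between-study variance, and study counts are illustrative, not those of the thesis.

```python
# Hedged sketch: simulate meta-analytic data from a known hierarchical model,
# apply fixed-effect and DerSimonian-Laird random-effects estimators, and
# check bias and 95% CI coverage. All settings are illustrative.
import numpy as np

rng = np.random.default_rng(42)
true_mu, tau2, k_studies, n_reps = 0.3, 0.05, 20, 2000

def simulate_once():
    v = rng.uniform(0.01, 0.1, size=k_studies)               # within-study variances
    theta = rng.normal(true_mu, np.sqrt(tau2), k_studies)    # true study effects
    return rng.normal(theta, np.sqrt(v)), v                  # observed effects

def fixed_effect(y, v):
    w = 1.0 / v
    return np.sum(w * y) / np.sum(w), 1.0 / np.sum(w)

def dersimonian_laird(y, v):
    w = 1.0 / v
    mu_fe, _ = fixed_effect(y, v)
    q = np.sum(w * (y - mu_fe) ** 2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2_hat = max(0.0, (q - (len(y) - 1)) / c)              # DL between-study variance
    w_re = 1.0 / (v + tau2_hat)
    return np.sum(w_re * y) / np.sum(w_re), 1.0 / np.sum(w_re)

for name, est in [("fixed-effect", fixed_effect), ("random-effects (DL)", dersimonian_laird)]:
    means, covered = [], 0
    for _ in range(n_reps):
        y, v = simulate_once()
        mu, var = est(y, v)
        means.append(mu)
        covered += abs(mu - true_mu) < 1.96 * np.sqrt(var)
    print(f"{name}: bias={np.mean(means) - true_mu:+.4f}, coverage={covered / n_reps:.3f}")
```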
Abstract:
Diffusion tensor magnetic resonance imaging, which measures directional information of water diffusion in the brain, has emerged as a powerful tool for human brain studies. In this paper, we introduce a new Monte Carlo-based fiber tracking approach to estimate brain connectivity. One of the main characteristics of this approach is that all parameters of the algorithm are automatically determined at each point using the entropy of the eigenvalues of the diffusion tensor. Experimental results show the good performance of the proposed approach.
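A minimal sketch of one way such entropy-driven Monte Carlo tracking could be set up; the specific rule (step direction sampled around the principal eigenvector with noise scaled by the normalized eigenvalue entropy), the constant toy tensor field, and the step size are assumptions for illustration, not the authors' algorithm.

```python
# Hedged sketch (not the paper's algorithm): Monte Carlo fiber tracking where
# the directional uncertainty of each step grows with the entropy of the
# diffusion-tensor eigenvalues. Tensor field, seed, and step size are toy values.
import numpy as np

rng = np.random.default_rng(7)

def eigenvalue_entropy(eigvals):
    """Normalized Shannon entropy of the (positive) tensor eigenvalues, in [0, 1]."""
    p = eigvals / eigvals.sum()
    return -np.sum(p * np.log(p)) / np.log(len(eigvals))

def sample_step(tensor, prev_dir, step=1.0):
    """Principal eigenvector perturbed by entropy-scaled Gaussian noise (assumed rule)."""
    w, v = np.linalg.eigh(tensor)
    e1 = v[:, -1]                          # principal diffusion direction
    if np.dot(e1, prev_dir) < 0:
        e1 = -e1                           # keep a consistent orientation
    d = e1 + eigenvalue_entropy(w) * rng.normal(size=3)
    return step * d / np.linalg.norm(d)

# Toy anisotropic tensor field: strongly oriented along x everywhere.
tensor = np.diag([1.7e-3, 0.3e-3, 0.3e-3])

def track(seed, n_steps=50):
    pos, direction = np.array(seed, float), np.array([1.0, 0.0, 0.0])
    path = [pos.copy()]
    for _ in range(n_steps):
        step_vec = sample_step(tensor, direction)
        direction = step_vec / np.linalg.norm(step_vec)
        pos = pos + step_vec
        path.append(pos.copy())
    return np.array(path)

# Many Monte Carlo streamlines from one seed approximate a connectivity estimate.
streamlines = [track(seed=(0.0, 0.0, 0.0)) for _ in range(100)]
```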
Abstract:
A partial phase diagram is constructed for diblock copolymer melts using lattice-based Monte Carlo simulations. This is done by locating the order-disorder transition (ODT) with the aid of a recently proposed order parameter and identifying the ordered phase over a wide range of copolymer compositions (0.2 <= f <= 0.8). Consistent with experiments, the disordered phase is found to exhibit direct first-order transitions to each of the ordered morphologies. This includes the spontaneous formation of a perforated-lamellar phase, which presumably forms in place of the gyroid morphology due to finite-size and/or nonequilibrium effects. Also included in our study is a detailed examination of disordered cylinder-forming (f=0.3) diblock copolymers, revealing a substantial degree of pretransitional chain stretching and short-range order that set in well before the ODT, as observed previously in analogous studies on lamellar-forming (f=0.5) molecules. (c) 2006 American Institute of Physics.
Abstract:
The phase diagram for diblock copolymer melts is evaluated from lattice-based Monte Carlo simulations using parallel tempering, improving upon earlier simulations that used sequential temperature scans. This new approach locates the order-disorder transition (ODT) far more accurately by the occurrence of a sharp spike in the heat capacity. The present study also performs a more thorough investigation of finite-size effects, which reveals that the gyroid (G) morphology spontaneously forms in place of the perforated-lamellar (PL) phase identified in the earlier study. Nevertheless, there still remains a small region where the PL phase appears to be stable. Interestingly, the lamellar (L) phase next to this region exhibits a small population of transient perforations, which may explain previous scattering experiments suggesting a modulated-lamellar (ML) phase.
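A minimal sketch of the parallel-tempering (replica-exchange) swap move that distinguishes this study from sequential temperature scans. The lattice model below is a generic nearest-neighbour Ising stand-in, not the diblock copolymer model of the paper, and the temperature ladder and sweep counts are illustrative.

```python
# Hedged sketch (not the authors' simulation): Metropolis sweeps at several
# temperatures plus replica-exchange swap attempts between adjacent replicas.
import numpy as np

rng = np.random.default_rng(3)
L, temps = 16, np.linspace(1.5, 3.5, 8)              # replica temperatures (assumed)
replicas = [rng.choice([-1, 1], size=(L, L)) for _ in temps]

def energy(s):
    """Nearest-neighbour Ising energy with periodic boundaries."""
    return -np.sum(s * np.roll(s, 1, axis=0)) - np.sum(s * np.roll(s, 1, axis=1))

def metropolis_sweep(s, T):
    for _ in range(s.size):
        i, j = rng.integers(L, size=2)
        dE = 2 * s[i, j] * (s[(i + 1) % L, j] + s[(i - 1) % L, j] +
                            s[i, (j + 1) % L] + s[i, (j - 1) % L])
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j] *= -1

def swap_attempts(replicas, temps):
    """Exchange configurations between adjacent temperatures (Metropolis criterion)."""
    for k in range(len(temps) - 1):
        beta1, beta2 = 1.0 / temps[k], 1.0 / temps[k + 1]
        delta = (beta1 - beta2) * (energy(replicas[k]) - energy(replicas[k + 1]))
        if rng.random() < np.exp(min(0.0, delta)):
            replicas[k], replicas[k + 1] = replicas[k + 1], replicas[k]

for sweep in range(200):
    for s, T in zip(replicas, temps):
        metropolis_sweep(s, T)
    swap_attempts(replicas, temps)
```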
Abstract:
The steadily accumulating literature on technical efficiency in fisheries attests to the importance of efficiency as an indicator of fleet condition and as an object of management concern. In this paper, we extend previous work by presenting a Bayesian hierarchical approach that yields both efficiency estimates and, as a byproduct of the estimation algorithm, probabilistic rankings of the relative technical efficiencies of fishing boats. The estimation algorithm is based on recent advances in Markov Chain Monte Carlo (MCMC) methods—Gibbs sampling, in particular—which have not been widely used in fisheries economics. We apply the method to a sample of 10,865 boat trips in the US Pacific hake (or whiting) fishery during 1987–2003. We uncover systematic differences between efficiency rankings based on sample mean efficiency estimates and those that exploit the full posterior distributions of boat efficiencies to estimate the probability that a given boat has the highest true mean efficiency.
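A minimal sketch of the post-processing step this abstract highlights: given MCMC (e.g., Gibbs) draws of each boat's mean efficiency, compare the ranking by posterior mean with the probability that each boat has the highest true mean efficiency. The "posterior draws" below are simulated stand-ins, not output from the paper's hierarchical model.

```python
# Hedged sketch: rankings by posterior mean vs. posterior probability of being
# the most efficient boat, computed from (simulated) MCMC draws.
import numpy as np

rng = np.random.default_rng(11)
n_boats, n_draws = 5, 4000

# Fake posterior draws: boats differ both in location and in posterior spread.
locations = np.array([0.80, 0.78, 0.75, 0.70, 0.65])
spreads   = np.array([0.02, 0.08, 0.03, 0.10, 0.04])
draws = rng.normal(locations, spreads, size=(n_draws, n_boats)).clip(0, 1)

posterior_means = draws.mean(axis=0)
rank_by_mean = np.argsort(-posterior_means)

# Probability each boat is the most efficient: fraction of joint draws in which
# that boat attains the maximum across boats.
p_best = np.bincount(draws.argmax(axis=1), minlength=n_boats) / n_draws

for b in range(n_boats):
    print(f"boat {b}: posterior mean={posterior_means[b]:.3f}, P(most efficient)={p_best[b]:.3f}")
print("ranking by posterior mean:", rank_by_mean)
```

Because boats with similar posterior means can have very different posterior spreads, the two orderings need not coincide, which is the systematic difference the abstract reports.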
Abstract:
Micromorphological characters of the fruiting bodies, such as ascus-type and hymenial amyloidity, and secondary chemistry have been widely employed as key characters in Ascomycota classification. However, the evolution of these characters has not yet been studied using molecular phylogenies. We have used a combined Bayesian and maximum likelihood based approach to trace character evolution on a tree inferred from a combined analysis of nuclear and mitochondrial ribosomal DNA sequences. The maximum likelihood aspect overcomes simplifications inherent in maximum parsimony methods, whereas the Markov chain Monte Carlo aspect renders results independent of any particular phylogenetic tree. The results indicate that the evolution of the two chemical characters is quite different, being stable once developed for the medullary lecanoric acid, whereas the cortical chlorinated xanthones appear to have been lost several times. The current ascus-types and the amyloidity of the hymenial gel in Pertusariaceae appear to have been developed within the family. The basal ascus-type of pertusarialean fungi remains unknown. (c) 2006 The Linnean Society of London, Biological Journal of the Linnean Society, 2006, 89, 615-626.
Abstract:
The identification of signatures of natural selection in genomic surveys has become an area of intense research, stimulated by the increasing ease with which genetic markers can be typed. Loci identified as subject to selection may be functionally important, and hence (weak) candidates for involvement in disease causation. They can also be useful in determining the adaptive differentiation of populations, and exploring hypotheses about speciation. Adaptive differentiation has traditionally been identified from differences in allele frequencies among different populations, summarised by an estimate of F_ST. Low outliers relative to an appropriate neutral population-genetics model indicate loci subject to balancing selection, whereas high outliers suggest adaptive (directional) selection. However, the problem of identifying statistically significant departures from neutrality is complicated by confounding effects on the distribution of F_ST estimates, and current methods have not yet been tested in large-scale simulation experiments. Here, we simulate data from a structured population at many unlinked, diallelic loci that are predominantly neutral but with some loci subject to adaptive or balancing selection. We develop a hierarchical-Bayesian method, implemented via Markov chain Monte Carlo (MCMC), and assess its performance in distinguishing the loci simulated under selection from the neutral loci. We also compare this performance with that of a frequentist method, based on moment-based estimates of F_ST. We find that both methods can identify loci subject to adaptive selection when the selection coefficient is at least five times the migration rate. Neither method could reliably distinguish loci under balancing selection in our simulations, even when the selection coefficient is twenty times the migration rate.
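A minimal sketch of the frequentist side of such an analysis, under stated assumptions: allele frequencies are simulated with population-specific drift plus a few artificially differentiated loci, per-locus F_ST is computed with a simple Wright-style moment estimator var(p)/(p_bar*(1-p_bar)) (not the exact estimator used in the paper, and without its neutral-model calibration), and empirical-tail outliers are flagged as candidates for directional (high F_ST) or balancing (low F_ST) selection.

```python
# Hedged sketch: simple moment-based F_ST outlier scan on simulated diallelic loci.
# All simulation settings and the outlier thresholds are illustrative.
import numpy as np

rng = np.random.default_rng(2024)
n_pops, n_loci, n_chrom = 6, 1000, 100      # populations, loci, sampled chromosomes

# Mostly neutral loci: ancestral frequencies plus population-specific drift.
p_anc = rng.uniform(0.1, 0.9, size=n_loci)
p_pop = np.clip(p_anc + rng.normal(0, 0.05, size=(n_pops, n_loci)), 0.01, 0.99)

# A few loci with exaggerated differentiation, mimicking directional selection.
selected = rng.choice(n_loci, size=20, replace=False)
p_pop[:, selected] = np.clip(p_anc[selected] + rng.normal(0, 0.25, size=(n_pops, 20)),
                             0.01, 0.99)

# Sample allele counts and estimate per-locus F_ST by the method of moments.
counts = rng.binomial(n_chrom, p_pop)        # shape (n_pops, n_loci)
p_hat = counts / n_chrom
p_bar = p_hat.mean(axis=0)
fst = p_hat.var(axis=0) / (p_bar * (1.0 - p_bar))

hi, lo = np.quantile(fst, [0.99, 0.01])
candidates_directional = np.where(fst > hi)[0]
candidates_balancing = np.where(fst < lo)[0]
print("high-F_ST outliers (directional candidates):", candidates_directional)
print("recovered truly selected loci:", np.intersect1d(candidates_directional, selected))
```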