77 results for MONTE-CARLO METHODS


Relevance: 90.00%

Abstract:

Micromorphological characters of the fruiting bodies, such as ascus-type and hymenial amyloidity, and secondary chemistry have been widely employed as key characters in Ascomycota classification. However, the evolution of these characters has not yet been studied using molecular phylogenies. We have used a combined Bayesian and maximum likelihood based approach to trace character evolution on a tree inferred from a combined analysis of nuclear and mitochondrial ribosomal DNA sequences. The maximum likelihood aspect overcomes simplifications inherent in maximum parsimony methods, whereas the Markov chain Monte Carlo aspect renders results independent of any particular phylogenetic tree. The results indicate that the evolution of the two chemical characters is quite different, being stable once developed for the medullary lecanoric acid, whereas the cortical chlorinated xanthones appear to have been lost several times. The current ascus-types and the amyloidity of the hymenial gel in Pertusariaceae appear to have been developed within the family. The basal ascus-type of pertusarialean fungi remains unknown. (c) 2006 The Linnean Society of London, Biological Journal of the Linnean Society, 2006, 89, 615-626.

Relevance: 90.00%

Abstract:

The identification of signatures of natural selection in genomic surveys has become an area of intense research, stimulated by the increasing ease with which genetic markers can be typed. Loci identified as subject to selection may be functionally important, and hence (weak) candidates for involvement in disease causation. They can also be useful in determining the adaptive differentiation of populations, and exploring hypotheses about speciation. Adaptive differentiation has traditionally been identified from differences in allele frequencies among different populations, summarised by an estimate of F-ST. Low outliers relative to an appropriate neutral population-genetics model indicate loci subject to balancing selection, whereas high outliers suggest adaptive (directional) selection. However, the problem of identifying statistically significant departures from neutrality is complicated by confounding effects on the distribution of F-ST estimates, and current methods have not yet been tested in large-scale simulation experiments. Here, we simulate data from a structured population at many unlinked, diallelic loci that are predominantly neutral but with some loci subject to adaptive or balancing selection. We develop a hierarchical-Bayesian method, implemented via Markov chain Monte Carlo (MCMC), and assess its performance in distinguishing the loci simulated under selection from the neutral loci. We also compare this performance with that of a frequentist method, based on moment-based estimates of F-ST. We find that both methods can identify loci subject to adaptive selection when the selection coefficient is at least five times the migration rate. Neither method could reliably distinguish loci under balancing selection in our simulations, even when the selection coefficient is twenty times the migration rate.
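The frequentist comparison method mentioned in this abstract rests on moment-based estimates of F-ST. The sketch below is a minimal, naive illustration (not the authors' hierarchical-Bayesian MCMC method, nor the Weir-Cockerham estimator; function names and numbers are illustrative): the among-population variance of allele frequencies at one diallelic locus, scaled by the pooled heterozygosity.

```python
import numpy as np

def fst_moment_estimate(p, n):
    """Naive moment-based F_ST at one diallelic locus: among-population
    variance of allele frequencies over pbar * (1 - pbar).
    p: per-population allele frequencies; n: per-population sample sizes."""
    p = np.asarray(p, dtype=float)
    pbar = np.average(p, weights=n)               # pooled allele frequency
    s2 = np.average((p - pbar) ** 2, weights=n)   # among-population variance
    return s2 / (pbar * (1.0 - pbar))

# Two populations with strongly diverged allele frequencies: high
# estimates flag candidate loci for directional selection, low
# estimates for balancing selection.
print(round(fst_moment_estimate([0.2, 0.8], [50, 50]), 3))  # → 0.36
```

In an outlier scan, this statistic is computed per locus and compared against its distribution under a neutral population-genetics model.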

Relevance: 90.00%

Abstract:

Biologists frequently attempt to infer the character states at ancestral nodes of a phylogeny from the distribution of traits observed in contemporary organisms. Because phylogenies are normally inferences from data, it is desirable to account for the uncertainty in estimates of the tree and its branch lengths when making inferences about ancestral states or other comparative parameters. Here we present a general Bayesian approach for testing comparative hypotheses across statistically justified samples of phylogenies, focusing on the specific issue of reconstructing ancestral states. The method uses Markov chain Monte Carlo techniques for sampling phylogenetic trees and for investigating the parameters of a statistical model of trait evolution. We describe how to combine information about the uncertainty of the phylogeny with uncertainty in the estimate of the ancestral state. Our approach does not constrain the sample of trees only to those that contain the ancestral node or nodes of interest, and we show how to reconstruct ancestral states of uncertain nodes using a most-recent-common-ancestor approach. We illustrate the methods with data on ribonuclease evolution in the Artiodactyla. Software implementing the methods (BayesMultiState) is available from the authors.

Relevance: 90.00%

Abstract:

We present argon predissociation vibrational spectra of the OH-.H2O and Cl-.H2O complexes in the 1000-1900 cm(-1) energy range, far below the OH stretching region reported in previous studies. This extension allows us to explore the fundamental transitions of the intramolecular bending vibrations associated with the water molecule, as well as that of the shared proton inferred from previous assignments of overtones in the higher energy region. Although the water bending fundamental in the Cl-.H2O spectrum is in very good agreement with expectations, the OH-.H2O spectrum is quite different from what was anticipated, being dominated by a strong feature at 1090 cm(-1). New full-dimensionality calculations of the OH-.H2O vibrational level structure using diffusion Monte Carlo and the VSCF/CI methods indicate this band arises from excitation of the shared proton.

Relevance: 90.00%

Abstract:

The vibrations of H3O2- and D3O2- are investigated using diffusion Monte Carlo (DMC) and vibrational configuration-interaction approaches, as implemented in the program MULTIMODE. These studies use the potential surface recently developed by Huang [J. Am. Chem. Soc. 126, 5042 (2004)]. The focus of this work is on the vibrational ground state and fundamentals which occur between 100 and 3700 cm(-1). In most cases, excellent agreement is obtained between the fundamental frequencies calculated by the two approaches. This serves to demonstrate the power of both methods for treating this very anharmonic system. Based on the results of the MULTIMODE and DMC treatments, the extent and nature of the couplings in H3O2- and D3O2- are investigated. (C) 2005 American Institute of Physics.
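The diffusion Monte Carlo technique used in this and the preceding study can be illustrated on a much simpler system. The toy sketch below (all parameters illustrative; real DMC vibrational studies use full molecular potential surfaces and importance sampling) estimates the ground-state energy of a 1D harmonic oscillator, exact value 0.5 in natural units, via the branching random walk that underlies DMC.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy DMC for a 1D harmonic oscillator, V(x) = x^2/2. Walkers diffuse
# freely, then branch with weights exp(-dt * V); averaging V over the
# equilibrated walker ensemble estimates the ground-state energy.
V = lambda x: 0.5 * x * x
dt, n_walkers = 0.01, 4000
x = rng.normal(0.0, 1.0, n_walkers)                   # initial walker ensemble
energies = []
for step in range(3000):
    x = x + rng.normal(0.0, np.sqrt(dt), n_walkers)   # free diffusion step
    w = np.exp(-dt * V(x))                            # birth/death weights
    x = rng.choice(x, size=n_walkers, p=w / w.sum())  # branching via resampling
    if step >= 1000:                                  # discard equilibration
        energies.append(V(x).mean())
e0 = float(np.mean(energies))                         # close to the exact 0.5
```

The fixed-population resampling here replaces the usual birth/death bookkeeping; it keeps the sketch short at the cost of some extra statistical noise.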

Relevance: 90.00%

Abstract:

The organization of non-crystalline polymeric materials at a local level, namely on a spatial scale between a few and 100 Å, is still unclear in many respects. The determination of the local structure in terms of the configuration and conformation of the polymer chain and of the packing characteristics of the chain in the bulk material represents a challenging problem. Data from wide-angle diffraction experiments are very difficult to interpret due to the very large amount of information that they carry, that is, the large number of correlations present in the diffraction patterns. We describe new approaches that permit a detailed analysis of the complex neutron diffraction patterns characterizing polymer melts and glasses. The coupling of different computer modelling strategies with neutron scattering data over a wide Q range allows the extraction of detailed quantitative information on the structural arrangements of the materials of interest. Proceeding from modelling routes as diverse as force field calculations, single-chain modelling and reverse Monte Carlo, we show the successes and pitfalls of each approach in describing model systems, which illustrate the need to attack the data analysis problem simultaneously from several fronts.
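The reverse Monte Carlo route named above inverts the usual simulation logic: particle moves are accepted according to how well the model reproduces measured data, not according to an energy. The following is a toy 1D sketch under stated assumptions (all names, histograms and parameters are illustrative; real RMC refines 3D configurations against neutron structure factors).

```python
import numpy as np

rng = np.random.default_rng(0)

def pair_hist(x, bins):
    """Crude pair-distance histogram standing in for a diffraction pattern."""
    d = np.abs(x[:, None] - x[None, :])[np.triu_indices(len(x), 1)]
    h, _ = np.histogram(d, bins=bins, density=True)
    return h

def chi2(model, target, sigma=0.05):
    """Misfit between model and 'measured' histograms."""
    return np.sum((model - target) ** 2) / sigma**2

bins = np.linspace(0.0, 5.0, 21)
target = pair_hist(rng.uniform(0, 5, 200), bins)    # stand-in for the data
x = rng.uniform(0, 5, 200)                          # initial configuration
cost = chi2(pair_hist(x, bins), target)
for _ in range(2000):
    trial = x.copy()
    i = rng.integers(len(x))                        # move one particle
    trial[i] = np.clip(trial[i] + rng.normal(0.0, 0.2), 0.0, 5.0)
    new_cost = chi2(pair_hist(trial, bins), target)
    # Metropolis-style acceptance on the quality of fit to the data
    if new_cost < cost or rng.random() < np.exp((cost - new_cost) / 2.0):
        x, cost = trial, new_cost
```

The acceptance rule is the standard Metropolis criterion with chi-squared/2 playing the role of the energy.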

Relevance: 90.00%

Abstract:

Determination of the local structure of a polymer glass by scattering methods is complex due to the number of spatial and orientational correlations, both from within the polymer chain (intrachain) and between neighbouring chains (interchain), from which the scattering arises. Recently considerable advances have been made in the structural analysis of relatively simple polymers such as poly(ethylene) through the use of broad Q neutron scattering data tightly coupled to atomistic modelling procedures. This paper presents the results of an investigation into the use of these procedures for the analysis of the local structure of a-PMMA which is chemically more complex with a much greater number of intrachain structural parameters. We have utilised high quality neutron scattering data obtained using SANDALS at ISIS coupled with computer models representing both the single chain and bulk polymer system. Several different modelling approaches have been explored which encompass such techniques as Reverse Monte Carlo refinement and energy minimisation and their relative merits and successes are discussed. These different approaches highlight structural parameters which any realistic model of glassy atactic PMMA must replicate.

Relevance: 90.00%

Abstract:

Statistical methods of inference typically require the likelihood function to be computable in a reasonable amount of time. The class of “likelihood-free” methods termed Approximate Bayesian Computation (ABC) is able to eliminate this requirement, replacing the evaluation of the likelihood with simulation from it. Likelihood-free methods have gained in efficiency and popularity in the past few years, following their integration with Markov Chain Monte Carlo (MCMC) and Sequential Monte Carlo (SMC) in order to better explore the parameter space. They have been applied primarily to estimating the parameters of a given model, but can also be used to compare models. Here we present novel likelihood-free approaches to model comparison, based upon the independent estimation of the evidence of each model under study. Key advantages of these approaches over previous techniques are that they allow the exploitation of MCMC or SMC algorithms for exploring the parameter space, and that they do not require a sampler able to mix between models. We validate the proposed methods using a simple exponential family problem before providing a realistic problem from human population genetics: the comparison of different demographic models based upon genetic data from the Y chromosome.
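The core ABC idea, replacing likelihood evaluation with simulation, is easiest to see in its basic rejection form, which predates the MCMC and SMC variants discussed here. Below is a minimal sketch on a toy exponential-rate problem (prior, tolerance and sample sizes are illustrative choices, not from the paper).

```python
import numpy as np

rng = np.random.default_rng(1)

# "Observed" data: 500 draws from an exponential model with rate 2.
obs = rng.exponential(scale=0.5, size=500)

def abc_rejection(obs, n_draws=20000, eps=0.02):
    """Basic ABC rejection: draw the rate from a uniform prior, simulate
    a data set of the same size, and keep the draw when the simulated
    sample mean lands within eps of the observed one. The likelihood is
    never evaluated."""
    s_obs = obs.mean()
    kept = []
    for _ in range(n_draws):
        theta = rng.uniform(0.1, 10.0)                     # prior on the rate
        sim = rng.exponential(scale=1.0 / theta, size=obs.size)
        if abs(sim.mean() - s_obs) < eps:
            kept.append(theta)
    return np.array(kept)

post = abc_rejection(obs)   # approximate posterior sample for the rate
```

The low acceptance rate of such blind prior sampling is exactly what the MCMC and SMC integrations mentioned above are designed to overcome.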

Relevance: 90.00%

Abstract:

We present, pedagogically, the Bayesian approach to composed error models under alternative, hierarchical characterizations; demonstrate, briefly, the Bayesian approach to model comparison using recent advances in Markov Chain Monte Carlo (MCMC) methods; and illustrate, empirically, the value of these techniques to natural resource economics and coastal fisheries management, in particular. The Bayesian approach to fisheries efficiency analysis is interesting for at least three reasons. First, it is a robust and highly flexible alternative to commonly applied, frequentist procedures, which dominate the literature. Second, the Bayesian approach is extremely simple to implement, requiring only a modest addition to most natural-resource economists' tool-kits. Third, despite its attractions, applications of Bayesian methodology in coastal fisheries management are few.

Relevance: 90.00%

Abstract:

The steadily accumulating literature on technical efficiency in fisheries attests to the importance of efficiency as an indicator of fleet condition and as an object of management concern. In this paper, we extend previous work by presenting a Bayesian hierarchical approach that yields both efficiency estimates and, as a byproduct of the estimation algorithm, probabilistic rankings of the relative technical efficiencies of fishing boats. The estimation algorithm is based on recent advances in Markov Chain Monte Carlo (MCMC) methods (Gibbs sampling, in particular) which have not been widely used in fisheries economics. We apply the method to a sample of 10,865 boat trips in the US Pacific hake (or whiting) fishery during 1987-2003. We uncover systematic differences between efficiency rankings based on sample mean efficiency estimates and those that exploit the full posterior distributions of boat efficiencies to estimate the probability that a given boat has the highest true mean efficiency.
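The probabilistic ranking described here falls out of the posterior draws almost for free: count how often each unit ranks first across joint MCMC draws. A minimal sketch with hypothetical posterior samples (the Beta distributions below are stand-ins for Gibbs output, not estimates from the hake fishery data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical posterior draws of mean technical efficiency for four
# boats (rows = MCMC draws, columns = boats).
draws = np.column_stack([
    rng.beta(8, 2, 5000),    # boat 0: high efficiency
    rng.beta(6, 4, 5000),    # boat 1
    rng.beta(5, 5, 5000),    # boat 2
    rng.beta(7, 3, 5000),    # boat 3
])

# Probability that each boat has the highest true mean efficiency:
# the fraction of joint posterior draws in which it ranks first.
p_best = np.bincount(draws.argmax(axis=1), minlength=4) / len(draws)
```

Unlike a ranking of point estimates, `p_best` carries the full posterior uncertainty: two boats with similar mean efficiencies split the probability mass rather than receiving an arbitrary order.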

Relevance: 90.00%

Abstract:

The Homeric epics are among the greatest masterpieces of literature, but when they were produced is not known with certainty. Here we apply evolutionary-linguistic phylogenetic statistical methods to differences in Homeric, Modern Greek and ancient Hittite vocabulary items to estimate a date of approximately 710–760 BCE for these great works. Our analysis compared a common set of vocabulary items among the three pairs of languages, recording for each item whether the words in the two languages were cognate – derived from a shared ancestral word – or not. We then used a likelihood-based Markov chain Monte Carlo procedure to estimate the most probable times in years separating these languages given the percentage of words they shared, combined with knowledge of the rates at which different words change. Our date for the epics is in close agreement with historians' and classicists' beliefs derived from historical and archaeological sources.

Relevance: 90.00%

Abstract:

The hybrid Monte Carlo (HMC) method is a popular and rigorous method for sampling from a canonical ensemble. The HMC method is based on classical molecular dynamics simulations combined with a Metropolis acceptance criterion and a momentum resampling step. While the HMC method completely resamples the momentum after each Monte Carlo step, the generalized hybrid Monte Carlo (GHMC) method can be implemented with a partial momentum refreshment step. This property seems desirable for keeping some of the dynamic information throughout the sampling process similar to stochastic Langevin and Brownian dynamics simulations. It is, however, vital to the success of the GHMC method that the rejection rate in the molecular dynamics part is kept at a minimum. Otherwise an undesirable Zitterbewegung in the Monte Carlo samples is observed. In this paper, we describe a method to achieve very low rejection rates by using a modified energy, which is preserved to high order along molecular dynamics trajectories. The modified energy is based on backward error results for symplectic time-stepping methods. The proposed generalized shadow hybrid Monte Carlo (GSHMC) method is applicable to NVT as well as NPT ensemble simulations.
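The plain HMC baseline that GSHMC refines can be sketched in a few lines. The toy example below samples a standard normal, so U(q) = q^2/2; it uses full momentum resampling (the GHMC variant would only partially refresh p) and a Metropolis test on the leapfrog energy error. Step size and trajectory length are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

grad_U = lambda q: q            # potential U(q) = q^2/2 (standard normal target)
U = lambda q: 0.5 * q * q

def hmc_step(q, eps=0.1, n_leap=20):
    p = rng.normal()                     # full momentum resampling (plain HMC)
    q_new, p_new = q, p
    p_new -= 0.5 * eps * grad_U(q_new)   # leapfrog half step in momentum
    for _ in range(n_leap - 1):
        q_new += eps * p_new
        p_new -= eps * grad_U(q_new)
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(q_new)   # closing half step
    # Metropolis accept/reject on the energy error of the trajectory
    dH = U(q_new) + 0.5 * p_new**2 - U(q) - 0.5 * p * p
    return q_new if rng.random() < np.exp(-dH) else q

q, samples = 0.0, []
for _ in range(5000):
    q = hmc_step(q)
    samples.append(q)
```

GSHMC differs from this sketch in two respects discussed in the abstract: the accept/reject test uses a high-order modified ("shadow") energy rather than dH, and the momentum is only partially refreshed between trajectories.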

Relevance: 90.00%

Abstract:

Since the advent of wide-angle imaging of the inner heliosphere, a plethora of techniques have been developed to investigate the three-dimensional structure and kinematics of solar wind transients, such as coronal mass ejections, from their signatures in single- and multi-spacecraft imaging observations. These techniques, which range from the highly complex and computationally intensive to methods based on simple curve fitting, all have their inherent advantages and limitations. In the analysis of single-spacecraft imaging observations, much use has been made of the fixed φ fitting (FPF) and harmonic mean fitting (HMF) techniques, in which the solar wind transient is considered to be a radially propagating point source (fixed φ, FP, model) and a radially expanding circle anchored at Sun centre (harmonic mean, HM, model), respectively. Initially, we compare the radial speeds and propagation directions derived from application of the FPF and HMF techniques to a large set of STEREO/Heliospheric Imager (HI) observations. As the geometries on which these two techniques are founded constitute extreme descriptions of solar wind transients in terms of their extent along the line of sight, we describe a single-spacecraft fitting technique based on a more generalized model for which the FP and HM geometries form the limiting cases. In addition to providing estimates of a transient’s speed and propagation direction, the self-similar expansion fitting (SSEF) technique provides, in theory, the capability to estimate the transient’s angular extent in the plane orthogonal to the field of view. Using the HI observations, and also by performing a Monte Carlo simulation, we assess the potential of the SSEF technique.

Relevance: 90.00%

Abstract:

The European Centre for Medium-Range Weather Forecasts (ECMWF) provides an aerosol re-analysis starting from year 2003 for the Monitoring Atmospheric Composition and Climate (MACC) project. The re-analysis assimilates total aerosol optical depth retrieved by the Moderate Resolution Imaging Spectroradiometer (MODIS) to correct for model departures from observed aerosols. The re-analysis therefore combines satellite retrievals with the full spatial coverage of a numerical model. Re-analysed products are used here to estimate the shortwave direct and first indirect radiative forcing of anthropogenic aerosols over the period 2003-2010, using methods previously applied to satellite retrievals of aerosols and clouds. The best estimate of globally-averaged, all-sky direct radiative forcing is -0.7 ± 0.3 W m-2. The standard deviation is obtained by a Monte-Carlo analysis of uncertainties, which accounts for uncertainties in the aerosol anthropogenic fraction, aerosol absorption, and cloudy-sky effects. Further accounting for differences between the present-day natural and pre-industrial aerosols provides a direct radiative forcing estimate of -0.4 ± 0.3 W m-2. The best estimate of globally-averaged, all-sky first indirect radiative forcing is -0.6 ± 0.4 W m-2. Its standard deviation accounts for uncertainties in the aerosol anthropogenic fraction, and in cloud albedo and cloud droplet number concentration susceptibilities to aerosol changes. The distribution of first indirect radiative forcing is asymmetric and is bounded by -0.1 and -2.0 W m-2. In order to decrease uncertainty ranges, better observational constraints on aerosol absorption and on the sensitivity of cloud droplet number concentrations to aerosol changes are required.
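The Monte-Carlo uncertainty analysis used for the standard deviations above amounts to sampling each uncertain input from its assumed distribution and propagating the samples through the forcing calculation. A minimal sketch of the idea, with purely illustrative input distributions (a two-factor product, not the MACC analysis chain):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Illustrative uncertain inputs, NOT the MACC re-analysis values:
# the forcing is modelled as the product of an uncertain forcing
# efficiency and an uncertain anthropogenic fraction.
forcing_eff = rng.normal(-2.0, 0.6, n)   # W m-2 per unit anthropogenic aerosol
anth_frac = rng.normal(0.35, 0.05, n)    # anthropogenic fraction of aerosol

forcing = forcing_eff * anth_frac        # Monte Carlo sample of the forcing
best, sigma = forcing.mean(), forcing.std()
print(f"{best:.2f} +/- {sigma:.2f} W m-2")
```

The sample standard deviation directly gives the quoted uncertainty, and inspecting the sampled distribution (rather than assuming normality) is what reveals asymmetries like the one reported for the indirect forcing.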

Relevance: 90.00%

Abstract:

We compare linear autoregressive (AR) models and self-exciting threshold autoregressive (SETAR) models in terms of their point forecast performance, and their ability to characterize the uncertainty surrounding those forecasts, i.e. interval or density forecasts. A two-regime SETAR process is used as the data-generating process in an extensive set of Monte Carlo simulations, and we consider the discriminatory power of recently developed methods of forecast evaluation for different degrees of non-linearity. We find that the interval and density evaluation methods are unlikely to show the linear model to be deficient on samples of the size typical for macroeconomic data.
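A two-regime SETAR data-generating process of the kind used in such Monte Carlo simulations can be simulated in a few lines. The coefficients, threshold and sample size below are illustrative, not those of the study's design:

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_setar(n, threshold=0.0, burn=200):
    """Two-regime SETAR(1) data-generating process: the AR(1) coefficient
    switches according to whether the previous value lies below or above
    the threshold. A burn-in period removes dependence on the start value."""
    y = np.zeros(n + burn)
    for t in range(1, n + burn):
        phi = 0.9 if y[t - 1] <= threshold else 0.2   # regime-dependent AR(1)
        y[t] = phi * y[t - 1] + rng.normal()
    return y[burn:]

y = simulate_setar(500)   # one artificial sample for forecast evaluation
```

In a study of discriminatory power, many such samples would be generated, AR and SETAR forecasts produced for each, and the interval/density evaluation methods applied to the resulting forecast errors.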