996 results for MCMC sampling


Relevance: 20.00%

Abstract:

Global diversity curves reflect more than just the number of taxa that have existed through time: they also mirror variation in the nature of the fossil record and the way the record is reported. These sampling effects are best quantified by assembling and analyzing large numbers of locality-specific biotic inventories. Here, we introduce a new database of this kind for the Phanerozoic fossil record of marine invertebrates. We apply four substantially distinct analytical methods that estimate taxonomic diversity by quantifying and correcting for variation through time in the number and nature of inventories. Variation introduced by the use of two dramatically different counting protocols also is explored. We present sampling-standardized diversity estimates for two long intervals that sum to 300 Myr (Middle Ordovician-Carboniferous; Late Jurassic-Paleogene). Our new curves differ considerably from traditional, synoptic curves. For example, some of them imply unexpectedly low late Cretaceous and early Tertiary diversity levels. However, such factors as the current emphasis in the database on North America and Europe still obscure our view of the global history of marine biodiversity. These limitations will be addressed as the database and methods are refined.
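
As a rough illustration of what sampling standardization involves, the sketch below implements classical rarefaction: an equal quota of occurrences is drawn from each time bin and the number of distinct taxa is averaged over repeated trials. It is a generic Python example, not the four protocols used in the study; the quota, bin names, and toy data are invented for illustration.

```python
import random

def rarefied_richness(occurrences_by_bin, quota, trials=100, seed=0):
    """Classical rarefaction: subsample an equal quota of occurrences from
    each time bin and report the mean number of distinct taxa, so that bins
    with very different sampling intensities become comparable."""
    rng = random.Random(seed)
    estimates = {}
    for time_bin, occs in occurrences_by_bin.items():
        if len(occs) < quota:
            continue  # too few occurrences to standardize this bin
        counts = []
        for _ in range(trials):
            subsample = rng.sample(occs, quota)
            counts.append(len(set(subsample)))
        estimates[time_bin] = sum(counts) / trials
    return estimates

# Toy inventories (one taxon name per occurrence), invented for illustration
occurrences = {
    "bin_A": ["t1", "t2", "t1", "t3", "t2", "t4", "t1", "t5"],
    "bin_B": ["t1", "t1", "t2", "t1", "t2", "t1", "t1", "t2", "t3", "t3"],
}
print(rarefied_richness(occurrences, quota=6))
```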

Relevance: 20.00%

Abstract:

In this paper we determine the extent to which host-mediated mutations and a known sampling bias affect evolutionary studies of human influenza A. Previous phylogenetic reconstruction of influenza A (H3N2) evolution using the hemagglutinin gene revealed an excess of nonsilent substitutions assigned to the terminal branches of the tree. We investigate two hypotheses to explain this observation. The first hypothesis is that the excess reflects mutations that were either not present or were at low frequency in the viral sample isolated from its human host, and that these mutations increased in frequency during passage of the virus in embryonated eggs. A set of 22 codons known to undergo such “host-mediated” mutations showed a significant excess of mutations assigned to branches attaching sequences from egg-cultured (as opposed to cell-cultured) isolates to the tree. Our second hypothesis is that the remaining excess results from sampling bias. Influenza surveillance is purposefully biased toward sequencing antigenically dissimilar strains in an effort to identify new variants that may signal the need to update the vaccine. This bias produces an excess of mutations assigned to terminal branches simply because an isolate with no close relatives is by definition attached to the tree by a relatively long branch. Simulations show that the magnitude of excess mutations we observed in the hemagglutinin tree is consistent with expectations based on our sampling protocol. Sampling bias does not affect inferences about evolution drawn from phylogenetic analyses. However, if possible, the excess caused by host-mediated mutations should be removed from studies of the evolution of influenza viruses as they replicate in their human hosts.
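
The sampling-bias argument can be illustrated with a very crude toy simulation (not the authors' protocol): strains diverge independently from a common ancestor, and uniform sampling is compared with sampling biased toward the most divergent strains, using the distance from each sampled strain to its nearest sampled relative as a stand-in for terminal branch length. All parameters below are arbitrary.

```python
import random

def mean_nearest_distance(n_strains=200, seq_len=300, mut_rate=0.01,
                          n_sampled=20, biased=False, seed=1):
    """Toy illustration: sampling biased toward dissimilar strains inflates
    the distance from each sampled strain to its nearest sampled relative,
    a crude proxy for terminal branch length."""
    rng = random.Random(seed)
    ancestor = [0] * seq_len
    # each strain independently accumulates mutations from the ancestor
    strains = [[b if rng.random() > mut_rate else 1 - b for b in ancestor]
               for _ in range(n_strains)]

    def dist(a, b):
        return sum(x != y for x, y in zip(a, b))

    if biased:
        # keep the strains most distant from the ancestor ("antigenically dissimilar")
        sampled = sorted(strains, key=lambda s: dist(s, ancestor))[-n_sampled:]
    else:
        sampled = rng.sample(strains, n_sampled)

    nearest = [min(dist(s, t) for t in sampled if t is not s) for s in sampled]
    return sum(nearest) / len(nearest)

print("uniform sampling:", mean_nearest_distance(biased=False))
print("biased sampling: ", mean_nearest_distance(biased=True))
```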

Relevance: 20.00%

Abstract:

A Monte Carlo simulation method for globular proteins, called extended-scaled-collective-variable (ESCV) Monte Carlo, is proposed. This method combines two Monte Carlo algorithms known as entropy-sampling and scaled-collective-variable algorithms. Entropy-sampling Monte Carlo is able to sample a large configurational space even in a disordered system that has a large number of potential barriers. In contrast, scaled-collective-variable Monte Carlo provides an efficient sampling for a system whose dynamics is highly cooperative. Because a globular protein is a disordered system whose dynamics is characterized by collective motions, a combination of these two algorithms could provide an optimal Monte Carlo simulation for a globular protein. As a test case, we have carried out an ESCV Monte Carlo simulation for a cell adhesive Arg-Gly-Asp-containing peptide, Lys-Arg-Cys-Arg-Gly-Asp-Cys-Met-Asp, and determined the conformational distribution at 300 K. The peptide contains a disulfide bridge between the two cysteine residues. This bond mimics the strong geometrical constraints that result from a protein's globular nature and give rise to highly cooperative dynamics. Computation results show that the ESCV Monte Carlo was not trapped at any local minimum and that the canonical distribution was correctly determined.
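
For orientation, the sketch below shows the entropy-sampling component alone on a one-dimensional double-well toy potential: a running entropy estimate S(E) is refined from the energy histogram so that sampling becomes roughly flat in energy and the walker can cross barriers. It does not include the scaled-collective-variable moves or any protein force field; the potential, bin layout, and iteration counts are illustrative assumptions.

```python
import math, random

def entropy_sampling_mc(energy, x0=-1.0, n_iter=5, n_steps=20000,
                        step=0.3, n_bins=40, e_max=4.0, seed=0):
    """Minimal entropy-sampling sketch on a 1D toy potential: accept moves
    with probability exp(S(E_old) - S(E_new)) and refine S(E) from the
    visited-energy histogram after each iteration."""
    rng = random.Random(seed)
    width = e_max / n_bins
    S = [0.0] * n_bins                         # running entropy estimate per energy bin

    def bin_of(e):
        return min(n_bins - 1, int(e / width))

    x = x0
    for _ in range(n_iter):
        hist = [0] * n_bins
        for _ in range(n_steps):
            x_new = x + rng.uniform(-step, step)
            e_old, e_new = energy(x), energy(x_new)
            if e_new < e_max:
                # flat-in-energy acceptance criterion
                if math.log(rng.random() + 1e-300) < S[bin_of(e_old)] - S[bin_of(e_new)]:
                    x = x_new
            hist[bin_of(energy(x))] += 1
        # update the entropy estimate from the histogram of visited energies
        S = [s + math.log(h + 1) for s, h in zip(S, hist)]
    return S

double_well = lambda x: (x * x - 1.0) ** 2     # two minima separated by a barrier at x = 0
entropy_estimate = entropy_sampling_mc(double_well)
```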

Relevance: 20.00%

Abstract:

Correlations in low-frequency atomic displacements predicted by molecular dynamics simulations on the order of 1 ns are undersampled at the time scales currently accessible to the technique. This is shown with three different representations of the fluctuations in a macromolecule: the reciprocal space of crystallography, using diffuse x-ray scattering data; real three-dimensional Cartesian space, using covariance matrices of the atomic displacements; and the 3N-dimensional configuration space of the protein, using dimensionally reduced projections to visualize the extent to which phase space is sampled.
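
A minimal sketch of the second representation (covariance analysis of atomic displacements) and its dimensional reduction might look like the following; the trajectory here is a synthetic random-walk stand-in, not simulation output, and the array shapes are assumptions.

```python
import numpy as np

def displacement_pca(trajectory):
    """Covariance analysis of atomic displacements for a trajectory of shape
    (n_frames, n_atoms, 3). Returns the covariance eigenvalues (descending)
    and the projection of each frame onto the two largest-variance modes,
    a common way to visualize how much of configuration space is sampled."""
    n_frames, n_atoms, _ = trajectory.shape
    coords = trajectory.reshape(n_frames, 3 * n_atoms)
    disp = coords - coords.mean(axis=0)          # displacements from the mean structure
    cov = disp.T @ disp / (n_frames - 1)         # 3N x 3N covariance matrix
    evals, evecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    projection = disp @ evecs[:, -2:]            # frame coordinates in the reduced space
    return evals[::-1], projection

# Synthetic stand-in for a short trajectory: 500 frames, 20 atoms
rng = np.random.default_rng(0)
traj = rng.normal(size=(500, 20, 3)).cumsum(axis=0) * 0.01 + rng.normal(size=(20, 3))
eigenvalues, proj = displacement_pca(traj)
print(proj.shape)   # (500, 2)
```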

Relevance: 20.00%

Abstract:

It is well known that quantum correlations for bipartite dichotomic measurements are those of the form γ = (⟨u_i, v_j⟩), where the vectors u_i and v_j are in the unit ball of a real Hilbert space. In this work we study the probability of the nonlocal nature of these correlations as a function of (Formula presented.), where the previous vectors are sampled according to the Haar measure in the unit sphere of (Formula presented.). In particular, we prove the existence of an (Formula presented.) such that if (Formula presented.), γ is nonlocal with probability tending to 1 as (Formula presented.), while for (Formula presented.), γ is local with probability tending to 1 as (Formula presented.).
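
A minimal sketch of the sampling setup described here: unit vectors drawn uniformly (Haar measure) on a sphere by normalizing Gaussian vectors, assembled into the matrix of inner products γ. Deciding whether the resulting correlation matrix is local is a separate optimization over deterministic strategies and is not attempted; m and n below are arbitrary choices.

```python
import numpy as np

def sample_correlation_matrix(m, n, seed=0):
    """Sample m vectors u_i and m vectors v_j uniformly on the unit sphere of
    R^n (normalized Gaussians give the Haar/uniform measure) and return the
    m x m matrix gamma[i, j] = <u_i, v_j>, the form taken by quantum bipartite
    dichotomic correlations."""
    rng = np.random.default_rng(seed)
    u = rng.normal(size=(m, n))
    v = rng.normal(size=(m, n))
    u /= np.linalg.norm(u, axis=1, keepdims=True)   # project onto the unit sphere
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    return u @ v.T

gamma = sample_correlation_matrix(m=10, n=50)
print(gamma.shape, float(np.abs(gamma).max()))
```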

Relevance: 20.00%

Abstract:

In tests in which a considerable number of examinees do not have enough time to answer all the items, we have what is called the speededness effect. Using the unidimensional Item Response Theory (IRT) model on speeded tests can lead to a series of erroneous interpretations, since that model assumes respondents have enough time to answer all the items. In this work, we develop a Bayesian analysis of the three-dimensional IRT model proposed by Wollack and Cohen (2005), considering a dependence structure between the prior distributions of the latent traits, which we model using copulas. We present an estimation procedure for the proposed model and carry out a simulation study comparing it with the analysis of Bazan et al. (2010), in which independent prior distributions were used for the latent traits. Finally, we perform a sensitivity analysis of the model under study and present an application to a real dataset from an EGRA subtest called Nonsense Words, administered in Peru in 2007. In this subtest, students are assessed orally by reading 50 nonsense words in sequence within 60 seconds, which gives rise to the speededness effect.
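
As a small illustration of the copula idea (not the model or priors actually used in this work), the sketch below draws correlated latent traits through a Gaussian copula: a standard-normal ability and a Beta-distributed speededness-type trait whose prior dependence is governed by the copula correlation. The marginals, the correlation value, and the variable names are assumptions.

```python
import numpy as np
from scipy.stats import norm, beta

def gaussian_copula_prior(n_subjects, rho, seed=0):
    """Draw correlated latent traits via a Gaussian copula: a Normal(0, 1)
    ability and a Beta(2, 2) speededness-type trait in (0, 1), with prior
    dependence controlled by the copula correlation rho."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_subjects)
    u = norm.cdf(z)                        # copula step: correlated uniforms
    theta = norm.ppf(u[:, 0])              # Normal(0, 1) ability
    gamma = beta.ppf(u[:, 1], 2, 2)        # Beta(2, 2) speededness trait
    return theta, gamma

theta, gamma = gaussian_copula_prior(n_subjects=1000, rho=0.6)
print(np.corrcoef(theta, gamma)[0, 1])     # induced dependence between the traits
```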

Relevance: 20.00%

Abstract:

We present an algorithm to process images of reflected Placido rings captured by a commercial videokeratoscope. Raw data are obtained with no Cartesian-to-polar-coordinate conversion, thus avoiding interpolation and associated numerical artifacts. The method provides a characteristic equation for the device and is able to process around 6 times more corneal data than the commercial software. Our proposal allows complete control over the whole process from the capture of corneal images until the computation of curvature radii.
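
Purely as a hypothetical illustration of working directly on the Cartesian pixel grid (the abstract does not disclose the actual algorithm), the sketch below locates bright-ring peaks along a single image row of a synthetic Placido-like pattern from sign changes of the horizontal intensity gradient, with no polar resampling.

```python
import numpy as np

def ring_peaks_on_grid(image, row):
    """Hypothetical illustration only: find bright-ring peaks along one image
    row directly on the Cartesian pixel grid by locating points where the
    horizontal intensity gradient changes from positive to non-positive."""
    profile = image[row].astype(float)
    grad = np.diff(profile)
    peaks = np.where((grad[:-1] > 0) & (grad[1:] <= 0))[0] + 1
    return peaks

# Toy synthetic pattern: concentric bright rings on a dark background
yy, xx = np.mgrid[0:200, 0:200]
r = np.hypot(xx - 100, yy - 100)
img = np.cos(r / 4.0) ** 8
print(ring_peaks_on_grid(img, row=100)[:10])
```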

Relevance: 20.00%

Abstract:

This correspondence presents an efficient method for reconstructing a band-limited signal in the discrete domain from its crossings with a sine wave. The method makes it possible to design A/D converters that only deliver the crossing timings, which are then used to interpolate the input signal at arbitrary instants. Potentially, it may allow for reductions in power consumption and complexity in these converters. The reconstruction in the discrete domain is based on a recently-proposed modification of the Lagrange interpolator, which is readily implementable with linear complexity and efficiently, given that it re-uses known schemes for variable fractional-delay (VFD) filters. As a spin-off, the method allows one to perform spectral analysis from sine wave crossings with the complexity of the FFT. Finally, the results in the correspondence are validated in several numerical examples.
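
The key observation is that at each crossing instant the input equals the reference sine, so the crossings provide nonuniform samples of the signal. The sketch below reconstructs the signal at arbitrary instants with a plain local Lagrange interpolator over the nearest crossings; it is a simple stand-in for the VFD-filter implementation described in the correspondence, and the reference frequency, test signal, and interpolation order are assumptions.

```python
import numpy as np

def reconstruct_from_sine_crossings(t_cross, ref, t_query, order=6):
    """At every crossing instant the input equals the reference sine, so the
    crossings give nonuniform samples (t_k, ref(t_k)). Estimate the input at
    arbitrary instants with a local Lagrange interpolator over the `order`
    nearest crossings."""
    values = ref(t_cross)                          # known signal values at the crossings
    out = np.empty_like(t_query, dtype=float)
    for i, t in enumerate(t_query):
        idx = np.argsort(np.abs(t_cross - t))[:order]
        tk, yk = t_cross[idx], values[idx]
        est = 0.0
        for j in range(order):                     # Lagrange basis evaluated at t
            w = np.prod([(t - tk[m]) / (tk[j] - tk[m]) for m in range(order) if m != j])
            est += yk[j] * w
        out[i] = est
    return out

# Toy check: a 3 Hz tone observed only through its crossings with a 20 Hz sine
ref = lambda t: np.sin(2 * np.pi * 20.0 * t)
sig = lambda t: 0.8 * np.sin(2 * np.pi * 3.0 * t + 0.4)
t_dense = np.linspace(0.0, 1.0, 200001)
d = sig(t_dense) - ref(t_dense)
t_cross = t_dense[:-1][np.sign(d[:-1]) != np.sign(d[1:])]   # approximate crossing instants
t_query = np.linspace(0.05, 0.95, 9)
print(np.max(np.abs(reconstruct_from_sine_crossings(t_cross, ref, t_query) - sig(t_query))))
```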

Relevance: 20.00%

Abstract:

Light traps have been widely used to sample insect abundance and diversity, but their performance for sampling scarab beetles in tropical forests has not been evaluated with respect to light source type and sampling hours throughout the night. The efficiency of mercury-vapour lamps, cool white light and ultraviolet light sources in attracting Dynastinae, Melolonthinae and Rutelinae scarab beetles, as well as the most suitable period of the night to carry out the sampling, was tested in different forest areas of Costa Rica. Our results showed that light source wavelength and sampling hours influenced scarab beetle catches. No significant differences in performance were observed between the ultraviolet light and mercury-vapour traps, whereas both yielded significantly greater species richness and abundance than cool white light traps. Species composition also varied between methods. Catches differed markedly across the sampling period, with the first five hours of the night being more effective than the last five. Because of their high efficiency and logistic advantages, we recommend ultraviolet light traps deployed during the first hours of the night as the best sampling method for biodiversity studies of these scarab beetles in tropical forests.

Relevance: 20.00%

Abstract:

The choice of sampling methods to survey saproxylic beetles is a key aspect of assessing conservation strategies for one of the most endangered assemblages in Europe. We evaluated the efficiency of three sampling methods, baited tube traps (TT), window traps in front of a hollow opening (WT), and emergence traps covering tree hollows (ET), for studying the richness and diversity of saproxylic beetle assemblages at the species and family levels in Mediterranean woodlands. We also examined trap efficiency in reporting ecological diversity and changes in the relative richness and abundance of the species forming trophic guilds: xylophagous, saprophagous/saproxylophagous, xylomycetophagous, predators and commensals. WT and ET were similarly effective in reporting species richness and diversity at the species and family levels, and provided an accurate profile of both the flying active and the hollow-linked saproxylic beetle assemblages. WT and ET were also the most complementary methods, together reporting more than 90 % of richness and diversity at both species and family levels. Diversity, richness and abundance of guilds were better characterized by ET, indicating higher efficiency in outlining the ecological community of saproxylics that inhabit tree hollows. TT were the least effective method at both taxonomic levels, sampling a biased portion of the beetle assemblage attracted by the trapping principle; however, they could be used as a specific method for families such as Bostrichiidae, Biphyllidae, Melyridae, Mycetophagidae or Curculionidae Scolytinae species. Finally, the combination of ET and WT allows a better characterization of saproxylic assemblages in Mediterranean woodlands by recording species with different biologies that are linked to different microhabitat types.

Relevance: 20.00%

Abstract:

Since the beginning of 3D computer vision, it has been necessary to use techniques that reduce the data to a treatable size while preserving the important aspects of the scene. This is now even more relevant with the new low-cost RGB-D sensors, which provide a stream of color and 3D data at approximately 30 frames per second. Many applications make use of these sensors and need a preprocessing step to downsample the data, either to reduce the processing time or to improve the data (e.g., reducing noise or enhancing the important features). In this paper, we present a comparison of downsampling techniques based on different principles. Concretely, five downsampling methods are included: a bilinear-based method, a normal-based method, a color-based method, a combination of the normal- and color-based samplings, and a growing neural gas (GNG)-based approach. For the comparison, two different models acquired with the Blensor software have been used. Moreover, to evaluate the effect of downsampling in a real application, a 3D non-rigid registration is performed with the sampled data. From the experiments we can conclude that, depending on the purpose of the application, some sampling kernels can drastically improve the results. The bilinear- and GNG-based methods provide homogeneous point clouds, whereas the color-based and normal-based methods provide datasets with a higher density of points in areas with specific features. In the non-rigid registration application, using a color-based sampled point cloud makes it possible to properly register two datasets in cases where intensity data are relevant to the model, outperforming the results obtained with a purely homogeneous sampling.
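
As an illustration of one family of methods (a color-based sampling, not necessarily the kernels used in the paper), the sketch below keeps a fixed fraction of a point cloud while biasing selection toward points whose color deviates most from that of their neighbours, so color edges stay densely sampled. The neighbourhood size, keep ratio, and toy cloud are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def color_based_downsample(points, colors, keep_ratio=0.1, k=8, seed=0):
    """Keep a fraction of the cloud, sampling points with probability
    proportional to how much their color deviates from the mean color of
    their k nearest neighbours (a simple color-saliency criterion)."""
    rng = np.random.default_rng(seed)
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k + 1)          # first neighbour is the point itself
    neigh_mean = colors[idx[:, 1:]].mean(axis=1)
    saliency = np.linalg.norm(colors - neigh_mean, axis=1) + 1e-6
    prob = saliency / saliency.sum()
    n_keep = int(keep_ratio * len(points))
    keep = rng.choice(len(points), size=n_keep, replace=False, p=prob)
    return points[keep], colors[keep]

# Toy cloud: a flat plane with a two-color stripe; the color edge is kept denser
rng = np.random.default_rng(1)
pts = np.c_[rng.uniform(0, 1, (5000, 2)), np.zeros(5000)]
cols = np.where(pts[:, :1] > 0.5, [1.0, 0.0, 0.0], [0.0, 0.0, 1.0])
sub_pts, sub_cols = color_based_downsample(pts, cols)
print(sub_pts.shape)
```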

Relevance: 20.00%

Abstract:

Sampling may promote prolonged engagement in sport by limiting physical injuries (Fraser-Thomas et al., 2005). Overtraining injuries are a concern for young athletes who specialize in one sport and engage in high volumes of deliberate practice (Hollander, Meyers, & Leunes, 1995; Law, Côté, & Ericsson, 2007). For instance, young gymnasts who practice for over 16 hours a week have been shown to have higher incidences of back injuries (Goldstein, Berger, Windier, & Jackson, 1991). A sampling approach grounded in child-controlled play (e.g. deliberate play) rather than highly adult-controlled practice (e.g. deliberate practice) has been proposed as a strategy to limit overuse and other sport-related injuries (Micheli, Glassman, & Klein, 2000). In summary, sampling may protect against sport attrition by limiting sport-related injuries and allowing children to have early experiences in sport that are enjoyable.

Psychosocial Benefits of Sampling

Only a small percentage of children who participate in school sports ever become elite athletes. Therefore, the psychosocial outcomes of sport participation are particularly important to consider. Recent studies with youth between the ages of 11 and 17 have found that those who are involved in a variety of extracurricular activities (e.g. sports, volunteering, arts) score more favourably on outcome measures such as Grade Point Average (GPA; Fredricks & Eccles, 2006a) and positive peer relationships (Fredricks & Eccles, 2006b) than youth who participate in fewer activities. These patterns are thought to exist because each extracurricular activity brings its own distinct pattern of socialization experiences that reinforce certain behaviours and/or teach various skills (Fredricks & Eccles, 2006b; Rose-Krasnor, Bussen, Willoughby, & Chambers, 2006). This contention is corroborated by studies of children's and youths' experiences in extracurricular activities indicating that youth have unique experiences in each activity that contribute to their development (Hansen, Larson, & Dworkin, 2003; Larson, Hansen, & Moneta, 2006). This has led Wilkes and Côté (2007) to propose that children who sample different activities (through their own choice or by virtue of parental direction) have a greater chance of developing the following five developmental outcomes compared to children who specialize in one activity: 1) life skills, 2) prosocial behaviour, 3) healthy identity, 4) diverse peer groups and 5) social capital.