94 results for markov chains monte carlo methods
Abstract:
Social organization is an important component of the population biology of a species that influences gene flow, the spatial pattern and scale of movements, and the effects of predation or exploitation by humans. An important element of social structure in mammals is group fidelity, which can be quantified through association indices. To describe the social organization of marine tucuxi dolphins (Sotalia guianensis) found in the Cananeia estuary, southeastern Brazil, association indices were applied to photo-identification data to characterize the temporal stability of relationships among members of this population. Eighty-seven days of fieldwork were conducted from May 2000 to July 2003, resulting in direct observations of 374 distinct groups. A total of 138 dolphins were identified on 1-38 distinct field days. Lone dolphins were rarely seen, whereas groups were composed of up to 60 individuals (mean ± 1 SD = 12.4 ± 11.4 individuals per group). A total of 29,327 photographs were analyzed, of which 6,312 (21.5%) were considered useful for identifying individuals. Half-weight and simple ratio indices were used to investigate associations among S. guianensis as revealed by the entire data set, data from the core study site, and data from groups composed of ≤ 10 individuals. Monte Carlo methods indicated that only 3 (9.3%) of 32 association matrices differed significantly from expectations based on random association. Thus, our study suggests that stable associations are not characteristic of S. guianensis in the Cananeia estuary.
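The two association indices named above have standard closed forms in the photo-identification literature; a minimal sketch in Python, with variable names (x, y_ab, y_a, y_b) that are conventional assumptions rather than notation taken from the abstract:

```python
# Association indices for a pair of individuals A and B over a set of
# sampling periods (standard definitions; names are assumptions):
#   x    - periods in which A and B were seen associated (same group)
#   y_ab - periods in which A and B were both seen, but not together
#   y_a  - periods in which only A was seen
#   y_b  - periods in which only B was seen

def half_weight_index(x, y_ab, y_a, y_b):
    """Half-weight index: HWI = x / (x + y_ab + (y_a + y_b) / 2)."""
    denom = x + y_ab + (y_a + y_b) / 2
    return x / denom if denom > 0 else 0.0

def simple_ratio_index(x, y_ab, y_a, y_b):
    """Simple ratio index: SRI = x / (x + y_ab + y_a + y_b)."""
    denom = x + y_ab + y_a + y_b
    return x / denom if denom > 0 else 0.0
```

Because the half-weight index halves the penalty for periods in which only one member of the pair was seen, HWI ≥ SRI for the same counts; it is commonly preferred when joint sightings are systematically under-recorded.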
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
A segregation analysis using Bayesian inference was carried out to estimate variance components and to verify the presence of major genes (MG) influencing two carcass traits, intramuscular fat (IMF, in %) and backfat thickness (BT, in mm), and one growth trait, weight gain from 25 to 90 kg live weight (WG, in g/day). Data from 1,257 animals from an F2 design, obtained by crossing Meishan boars with Large White and Landrace sows, were used. In animal breeding, finite polygenic models (FPM) can be an alternative to infinitesimal polygenic models (IPM) for the genetic evaluation of quantitative traits using complex pedigrees. IPM, FPM, and IPM combined with FPM were tested empirically to estimate variance components and the number of genes in the FPM. Marginal posterior means of variance components and parameters were estimated with a Bayesian methodology based on Markov chain Monte Carlo (MCMC) algorithms, via the Gibbs sampler and a reversible jump sampler (Metropolis-Hastings). The results provided evidence for four major genes, two for IMF and two for BT. For BT, the major gene explained most of the genetic variation, whereas for IMF the major gene significantly reduced the polygenic variation. For WG, it was not possible to determine a major-gene influence on the variation. Heritabilities estimated by fitting the IPM were 0.37, 0.24, and 0.37 for IMF, BT, and WG, respectively. Future studies based on this experiment using molecular markers to map the major genes affecting IMF and BT, in particular, are likely to succeed.
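The variance-component samplers described above run on a far richer animal model than can be reproduced here. As an illustrative sketch only, the Gibbs mechanics (alternating draws from full conditional distributions) can be shown on a toy normal model with a flat prior on the mean and a Jeffreys prior on the variance; the function name, priors, and model are assumptions for the sketch, not the paper's model:

```python
import random, math

def gibbs_normal(y, iters=2000, seed=1):
    """Minimal Gibbs sampler for y_i ~ N(mu, sigma2), flat prior on mu and
    Jeffreys prior p(sigma2) proportional to 1/sigma2. A toy stand-in for
    the variance-component samplers described in the abstract."""
    rng = random.Random(seed)
    n, ybar = len(y), sum(y) / len(y)
    sigma2 = 1.0
    draws = []
    for _ in range(iters):
        # Full conditional: mu | sigma2, y ~ N(ybar, sigma2 / n)
        mu = rng.gauss(ybar, math.sqrt(sigma2 / n))
        # Full conditional: sigma2 | mu, y ~ Inv-Gamma(n/2, SS/2)
        rate = sum((yi - mu) ** 2 for yi in y) / 2.0
        sigma2 = rate / rng.gammavariate(n / 2.0, 1.0)  # rate/Gamma(a,1) ~ InvGamma(a, rate)
        draws.append((mu, sigma2))
    return draws
```

Marginal posterior means are then approximated by averaging the retained draws, which is exactly how the marginal posterior means of the variance components above would be reported (after burn-in, which this sketch omits).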
Abstract:
The rural-urban migration phenomenon is analyzed by using an agent-based computational model. Agents are placed on lattices whose dimension varies from d = 2 up to d = 7. An agent's position on the lattice defines its social neighborhood (rural or urban), which is not tied to its spatial distribution. The effect of the lattice dimension is studied by analyzing the variation of the main parameters that characterize the migratory process. The dynamics displays strong effects even for around one million sites in the higher dimensions (d = 6, 7).
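The abstract does not specify the migration rule, so the following is only a toy majority-rule sketch of an agent-based model on a d-dimensional periodic lattice, meant to illustrate how the dimension d enters as a parameter; the rule, function names, and defaults are all assumptions:

```python
import itertools, random

def neighbors(site, L, d):
    """Nearest neighbors of a site on a d-dimensional periodic lattice of side L."""
    for axis in range(d):
        for step in (-1, 1):
            n = list(site)
            n[axis] = (n[axis] + step) % L
            yield tuple(n)

def run_migration(L=10, d=2, p_urban=0.1, steps=5, seed=0):
    """Toy majority-rule dynamics: a rural agent migrates (becomes urban)
    when more than half of its 2d lattice neighbors are urban; migration is
    one-way. Returns the urban fraction after each synchronous step."""
    rng = random.Random(seed)
    state = {s: (rng.random() < p_urban)
             for s in itertools.product(range(L), repeat=d)}
    fractions = []
    for _ in range(steps):
        new = {}
        for site, urban in state.items():
            urban_nbrs = sum(state[n] for n in neighbors(site, L, d))
            new[site] = urban or (urban_nbrs > d)  # > d of the 2d neighbors
        state = new
        fractions.append(sum(state.values()) / len(state))
    return fractions
```

Note how the cost grows with dimension: a lattice of side L has L**d sites and each site has 2d neighbors, which is why reaching around one million sites at d = 6 or 7 forces very small linear sizes.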
Abstract:
Proton computerized tomography deals with relatively thick targets like the human head or trunk. In this case, precise analytical calculation of the final proton energy is a rather complicated task, so Monte Carlo simulation stands out as a solution. We used the GEANT4.8.2 code to calculate the proton final energy spectra after passing through a thick Al absorber and compared them with experimental data obtained under the same conditions. The ICRU49, Ziegler85 and Ziegler2000 models from the low-energy extension pack were used. The results were also compared with SRIM2008 and MCNPX2.4 simulations, and with solutions of the Boltzmann transport equation in the Fokker-Planck approximation. © 2009 Elsevier Ltd. All rights reserved.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
We analyze the average performance of a general class of learning algorithms for the nondeterministic polynomial time complete problem of rule extraction by a binary perceptron. The examples are generated by a rule implemented by a teacher network of similar architecture. A variational approach is used in trying to identify the potential energy that leads to the largest generalization in the thermodynamic limit. We restrict our search to algorithms that always satisfy the binary constraints. A replica-symmetric ansatz leads to a learning algorithm which presents a phase transition in violation of an information-theoretic bound. Stability analysis shows that this is due to a failure of the replica-symmetric ansatz, and the first step of replica symmetry breaking (RSB) is studied. The variational method does not determine a unique potential, but it allows construction of a class with a unique minimum within each first-order valley. Members of this class improve on the performance of the Gibbs algorithm but fail to reach the Bayesian limit in the low-generalization phase. They even fail to reach the performance of the best binary vector, an optimal clipping of the barycenter of version space. We find a trade-off between good performance in the low-generalization phase and early onset of perfect generalization. Although the RSB solution may be locally stable, we discuss the possibility that it fails to be the correct saddle point globally. ©2000 The American Physical Society.
Abstract:
A nonthermal quantum mechanical statistical fragmentation model based on tunneling of particles through potential barriers is studied in compact two- and three-dimensional systems. It is shown that this fragmentation dynamics gives rise to several static and dynamic scaling relations. The critical exponents are found and compared with those obtained in classical statistical models of fragmentation of general interest, in particular with thermal fragmentation involving classical processes over potential barriers. Besides its general theoretical interest, the fragmentation dynamics discussed here is complementary to classical fragmentation dynamics of interest in chemical kinetics and can be useful in the study of a number of other dynamic processes, such as nuclear fragmentation. ©2000 The American Physical Society.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
A measurement technique for charm baryon lifetimes from hadroproduction data is presented. The measurement verified the lifetime analysis procedure in a sample with higher statistical precision. Other effects studied include mass reflections, the effects of the presence of a second charm particle, and mismeasurement of charm decays. Monte Carlo simulations were used for a detailed study of systematic effects in the charm data.
Abstract:
Linear mixed effects models have been widely used in analysis of data where responses are clustered around some random effects, so it is not reasonable to assume independence between observations in the same cluster. In most biological applications, it is assumed that the distributions of the random effects and of the residuals are Gaussian. This makes inferences vulnerable to the presence of outliers. Here, linear mixed effects models with normal/independent residual distributions for robust inferences are described. Specific distributions examined include univariate and multivariate versions of the Student-t, the slash and the contaminated normal. A Bayesian framework is adopted and Markov chain Monte Carlo is used to carry out the posterior analysis. The procedures are illustrated using birth weight data on rats in a toxicological experiment. Results from the Gaussian and robust models are contrasted, and it is shown how the implementation can be used for outlier detection. The thick-tailed distributions provide an appealing robust alternative to the Gaussian process in linear mixed models, and they are easily implemented using data augmentation and MCMC techniques.
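The normal/independent families mentioned above are scale mixtures of normals, which is what makes the MCMC implementation straightforward: each residual carries a latent mixing weight that is imputed by data augmentation, the conditional updates stay Gaussian, and small imputed weights flag outliers. A minimal sketch of the Student-t case (function name and parameterization are assumptions, not the paper's notation):

```python
import random, math

def t_residual(nu, sigma, rng):
    """Normal/independent representation of a Student-t residual:
    e = sigma * z / sqrt(w), with z ~ N(0, 1) and w ~ Gamma(nu/2, rate=nu/2).
    In a Gibbs sampler the latent weight w is imputed per observation
    (data augmentation); a small w marks that observation as an outlier."""
    z = rng.gauss(0.0, 1.0)
    w = rng.gammavariate(nu / 2.0, 2.0 / nu)  # gammavariate takes (shape, scale)
    return sigma * z / math.sqrt(w)
```

The slash and contaminated-normal residuals fit the same template with a different mixing distribution for w, which is why a single augmented sampler covers the whole family.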
Abstract:
Through the analyses of the Miyazawa-Jernigan matrix it has been shown that the hydrophobic effect generates the dominant driving force for protein folding. By using both lattice and off-lattice models, it is shown that hydrophobic-type potentials are indeed efficient in inducing the chain through nativelike configurations, but they fail to provide sufficient stability so as to keep the chain in the native state. However, through comparative Monte Carlo simulations, it is shown that hydrophobic potentials and steric constraints are two basic ingredients for the folding process. Specifically, it is shown that suitable pairwise steric constraints introduce strong changes on the configurational activity, whose main consequence is a huge increase in the overall stability condition of the native state; detailed analysis of the effects of steric constraints on the heat capacity and configurational activity are provided. The present results support the view that the folding problem of globular proteins can be approached as a process in which the mechanism to reach the native conformation and the requirements for the globule stability are uncoupled.
Abstract:
We present the results of a search for the flavor-changing neutral current decay B_s^0 → μ⁺μ⁻ using a data set with an integrated luminosity of 240 pb⁻¹ of pp̄ collisions at √s = 1.96 TeV collected with the D0 detector in Run II of the Fermilab Tevatron collider. We find the upper limit on the branching fraction to be B(B_s^0 → μ⁺μ⁻) ≤ 5.0 × 10⁻⁷ at the 95% C.L., assuming no contributions from the decay B_d^0 → μ⁺μ⁻ in the signal region. This limit is the most stringent upper bound on the branching fraction of B_s^0 → μ⁺μ⁻ to date. © 2005 The American Physical Society.
Abstract:
Proton beams in medical applications deal with relatively thick targets like the human head or trunk. Therefore, relatively small differences in the total proton stopping power given, for example, by the different models provided by GEANT4 can lead to significant disagreements in the final proton energy spectra when integrated along lengthy proton trajectories. This work presents proton energy spectra obtained by GEANT4.8.2 simulations using the ICRU49, Ziegler1985 and Ziegler2000 models for 19.68 MeV protons passing through Al absorbers of various thicknesses. The spectra were compared with experimental data, with TRIM/SRIM2008 and MCNPX2.4.0 simulations, and with the Payne analytical solution of the transport equation in the Fokker-Planck approximation. It is shown that the MCNPX simulations reproduce all experimental spectra reasonably well. For relatively thin targets all the methods give practically identical results, but this is not the case for thick absorbers. It should be noted that all the spectra were measured at proton energies significantly above 2 MeV, i.e., in the so-called Bethe-Bloch region. Therefore, the disagreements observed among GEANT4 results simulated with different models are somewhat unexpected. Further studies are necessary for better understanding and definitive conclusions. © 2009 American Institute of Physics.
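Beyond full transport codes, the mean energy of protons exiting a slab can be estimated from a range-energy rule alone. A sketch using the Bragg-Kleeman rule R = αE^p under the continuous-slowing-down approximation (straggling ignored); the defaults are commonly quoted constants for water, used here only as placeholders, not the aluminium data discussed above:

```python
def final_energy(e_in_mev, thickness_cm, alpha=0.0022, p=1.77):
    """Mean final proton energy after a slab, from the Bragg-Kleeman
    range-energy rule R = alpha * E**p (CSDA; energy straggling ignored).
    Defaults approximate water, NOT the Al absorbers of the abstract."""
    r_in = alpha * e_in_mev ** p      # CSDA range at the entrance energy
    r_out = r_in - thickness_cm       # residual range after the slab
    if r_out <= 0:
        return 0.0                    # proton stops inside the absorber
    return (r_out / alpha) ** (1.0 / p)
```

Because the exponent p appears inside a power law, a small model-to-model difference in stopping power compounds over a thick absorber, which is the integration effect the abstract describes.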
Abstract:
Proton computed tomography (pCT) deals with relatively thick targets like the human head or trunk. Thus, the fidelity of pCT as a tool for proton therapy planning depends on the accuracy of the physical formulas used for proton interaction with thick absorbers. Although the current overall accuracy of the proton stopping power in the Bethe-Bloch domain is about 1%, analytical calculations and Monte Carlo simulations with codes like TRIM/SRIM, MCNPX and GEANT4 do not agree with one another. Attempts to validate the codes against experimental data for thick absorbers run into difficulties: only a few data sets are available, and the existing ones were acquired at different initial proton energies and for different absorber materials. In this work we compare the results of our Monte Carlo simulations with existing experimental data in terms of a reduced calibration curve, i.e., the range-energy dependence normalized, on the range scale, by the full projected CSDA range for the given initial proton energy in the given material (taken from the NIST PSTAR database), and, on the final-energy scale, by the given initial proton energy. This approach is almost independent of energy and material. The results of our analysis are important for pCT development because the contradictions observed at arbitrarily low initial proton energies can now be easily scaled to typical pCT energies. © 2010 American Institute of Physics.