971 results for Sequential Monte Carlo methods


Relevance:

100.00%

Publisher:

Abstract:

This thesis comprises three articles on macroeconomics and finance. Each article corresponds to a chapter and is written in paper format. In the first article, co-authored with Axel Simonsen, we model and estimate a small open economy for Canada in a two-country Dynamic Stochastic General Equilibrium (DSGE) framework. We show that it is important to account for the correlation between domestic and foreign shocks and for incomplete pass-through. In the second chapter-paper, co-authored with Hedibert Freitas Lopes, we estimate a regime-switching macro-finance model of the term structure of interest rates to study the post-World War II (WWII) joint behavior of US macro-variables and the yield curve. We show that our model tracks the US NBER cycles well, that the addition of regime changes is important for explaining the expectations theory of the term structure, and that macro-variables have increasing importance in recessions in explaining the variability of the yield curve. We also present a novel sequential Monte Carlo algorithm to learn about the parameters and the latent states of the economy. In the third chapter, I present a Gaussian Affine Term Structure Model (ATSM) with latent jumps in order to address two questions: (1) what are the implications of incorporating jumps in an ATSM for Asian option pricing, in the particular case of the Brazilian DI Index (IDI) option, and (2) how do jumps and options affect the bond risk-premia dynamics. I show that the jump risk premium is negative in a scenario of decreasing interest rates (my sample period) and is important for explaining the level of yields, and that Gaussian models without jumps or with constant-intensity jumps price Asian options well.
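The sequential Monte Carlo algorithm of the second chapter is not reproduced in this abstract; as a hedged illustration of the general idea, a minimal bootstrap particle filter for a toy linear-Gaussian AR(1) state-space model (all model choices and parameter values are illustrative, not the paper's) might look like:

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=1000,
                              phi=0.95, sigma_state=0.2, sigma_obs=0.5,
                              seed=0):
    """Bootstrap particle filter for a toy AR(1) state-space model:
    x_t = phi * x_{t-1} + N(0, sigma_state), y_t = x_t + N(0, sigma_obs).
    Returns the filtered posterior mean of x_t at each step."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)
    filtered_means = []
    for y in observations:
        # propagate particles through the state transition
        particles = phi * particles + rng.normal(0.0, sigma_state, n_particles)
        # reweight by the Gaussian observation likelihood
        w = np.exp(-0.5 * ((y - particles) / sigma_obs) ** 2)
        w /= w.sum()
        filtered_means.append(float(np.sum(w * particles)))
        # multinomial resampling to fight weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=w)
    return filtered_means
```

The propagate / reweight / resample cycle is the common skeleton of sequential Monte Carlo; the paper's version would additionally learn static parameters.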


Social organization is an important component of the population biology of a species that influences gene flow, the spatial pattern and scale of movements, and the effects of predation or exploitation by humans. An important element of social structure in mammals is group fidelity, which can be quantified through association indices. To describe the social organization of marine tucuxi dolphins (Sotalia guianensis) found in the Cananeia estuary, southeastern Brazil, association indices were applied to photo-identification data to characterize the temporal stability of relationships among members of this population. Eighty-seven days of fieldwork were conducted from May 2000 to July 2003, resulting in direct observations of 374 distinct groups. A total of 138 dolphins were identified on 1-38 distinct field days. Lone dolphins were rarely seen, whereas groups were composed of up to 60 individuals (mean +/- 1 SD = 12.4 +/- 11.4 individuals per group). A total of 29,327 photographs were analyzed, of which 6,312 (21.5%) were considered useful for identifying individuals. Half-weight and simple ratio indices were used to investigate associations among S. guianensis as revealed by the entire data set, data from the core study site, and data from groups composed of <= 10 individuals. Monte Carlo methods indicated that only 3 (9.3%) of 32 association matrices differed significantly from expectations based on random association. Thus, our study suggests that stable associations are not characteristic of S. guianensis in the Cananeia estuary.
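The half-weight and simple ratio indices used above have standard closed forms; a minimal sketch (variable names are ours, not the authors'):

```python
def half_weight_index(x, y_a, y_b, y_ab=0):
    """Half-weight association index: x = sightings of A and B together,
    y_a / y_b = sightings of only A / only B, y_ab = both sighted in the
    same period but in different groups."""
    return x / (x + y_ab + 0.5 * (y_a + y_b))

def simple_ratio_index(x, y_a, y_b, y_ab=0):
    """Simple ratio index: joint sightings over all sighting occasions
    involving either individual."""
    return x / (x + y_ab + y_a + y_b)
```

Because the half-weight index halves the penalty for sightings of one animal alone, it is never smaller than the simple ratio index on the same counts.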


Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)


The rural-urban migration phenomenon is analyzed using an agent-based computational model. Agents are placed on lattices with dimensions varying from d = 2 up to d = 7. The placement of the agents on the lattice defines their social neighborhood (rural or urban), which is not related to their spatial distribution. The effect of the lattice dimension is studied by analyzing the variation of the main parameters that characterize the migratory process. The dynamics displays strong effects in the higher dimensions (d = 6, 7), even for lattices of around one million sites.
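The abstract does not give the model's update rule; as a toy sketch only, one sweep of a rural(0)/urban(1) dynamics on a 2-D lattice with periodic boundaries, where the migration probability grows with the fraction of urban neighbors (the sigmoid form and β are our assumptions, not the paper's), could be:

```python
import numpy as np

def migration_step(state, beta=2.0, rng=None):
    """One sweep of a toy rural(0)/urban(1) dynamics on a 2-D lattice with
    periodic boundaries: each site becomes urban with a probability that
    grows with the fraction of its four urban neighbors."""
    rng = rng or np.random.default_rng(0)
    # count urban neighbors by periodic shifts along both axes
    neighbors = sum(np.roll(state, s, axis=a) for a in (0, 1) for s in (1, -1))
    frac_urban = neighbors / 4.0
    p_urban = 1.0 / (1.0 + np.exp(-beta * (frac_urban - 0.5)))
    return (rng.random(state.shape) < p_urban).astype(int)
```

Extending the neighbor count to d axes is what makes the dimensions d = 2 to 7 studied in the paper comparable.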


Proton computerized tomography deals with relatively thick targets such as the human head or trunk. In this case a precise analytical calculation of the final proton energy is a rather complicated task, so Monte Carlo simulation stands out as the solution. We used the GEANT4.8.2 code to calculate the final proton energy spectra after passage through a thick Al absorber and compared them with experimental data obtained under the same conditions. The ICRU49, Ziegler85, and Ziegler2000 models from the low-energy extension pack were used. The results were also compared with SRIM2008 and MCNPX2.4 simulations, and with solutions of the Boltzmann transport equation in the Fokker-Planck approximation. (C) 2009 Elsevier Ltd. All rights reserved.


The generalized exponential distribution, proposed by Gupta and Kundu (1999), is a good alternative to standard lifetime distributions such as the exponential, Weibull, or gamma. Several authors have considered Bayesian estimation of the parameters of the generalized exponential distribution, assuming independent gamma priors and other informative priors. In this paper, we consider a Bayesian analysis of the generalized exponential distribution assuming conventional non-informative prior distributions, such as the Jeffreys and reference priors, to estimate the parameters. These priors are compared with independent gamma priors for both parameters. The comparison is carried out by examining the frequentist coverage probabilities of Bayesian credible intervals. We show that the maximal data information prior leads to an improper posterior distribution for the parameters of a generalized exponential distribution. It is also shown that the choice of the parameter of interest is very important for the reference prior: different choices lead to different reference priors in this case. Numerical inference for the parameters is illustrated with data sets of different sizes, using Markov chain Monte Carlo (MCMC) methods.
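As a hedged sketch of the kind of MCMC inference described (independent gamma priors on both parameters; the hyperparameters, proposal scale, and starting point are illustrative, not the paper's), a random-walk Metropolis sampler on the log-parameters of the generalized exponential density f(x; a, l) = a·l·e^(−lx)·(1 − e^(−lx))^(a−1) could be:

```python
import numpy as np

def log_post(data, a, l, shape=1.0, rate=1.0):
    """Log-posterior for GE(a, l) with independent Gamma(shape, rate)
    priors on a and l (illustrative hyperparameters)."""
    if a <= 0 or l <= 0:
        return -np.inf
    loglik = (len(data) * (np.log(a) + np.log(l)) - l * data.sum()
              + (a - 1) * np.log1p(-np.exp(-l * data)).sum())
    logprior = (shape - 1) * (np.log(a) + np.log(l)) - rate * (a + l)
    return loglik + logprior

def metropolis_ge(data, n_iter=2000, step=0.15, seed=0):
    """Random-walk Metropolis on (log a, log l); returns posterior draws."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(2)                      # start at a = l = 1
    lp = log_post(data, 1.0, 1.0)
    draws = np.empty((n_iter, 2))
    for i in range(n_iter):
        prop = theta + rng.normal(0.0, step, 2)
        lp_prop = log_post(data, np.exp(prop[0]), np.exp(prop[1]))
        # prop.sum() - theta.sum() is the log-Jacobian of the log transform
        if np.log(rng.random()) < lp_prop - lp + prop.sum() - theta.sum():
            theta, lp = prop, lp_prop
        draws[i] = np.exp(theta)
    return draws
```

Repeating such runs over simulated data sets and counting how often the credible intervals cover the true parameters is how the frequentist coverage comparison above would proceed.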


We analyze the average performance of a general class of learning algorithms for the nondeterministic polynomial time complete problem of rule extraction by a binary perceptron. The examples are generated by a rule implemented by a teacher network of similar architecture. A variational approach is used in trying to identify the potential energy that leads to the largest generalization in the thermodynamic limit. We restrict our search to algorithms that always satisfy the binary constraints. A replica symmetric ansatz leads to a learning algorithm which presents a phase transition in violation of an information theoretical bound. Stability analysis shows that this is due to a failure of the replica symmetric ansatz and the first step of replica symmetry breaking (RSB) is studied. The variational method does not determine a unique potential but it allows construction of a class with a unique minimum within each first order valley. Members of this class improve on the performance of Gibbs algorithm but fail to reach the Bayesian limit in the low generalization phase. They even fail to reach the performance of the best binary, an optimal clipping of the barycenter of version space. We find a trade-off between a good low performance and early onset of perfect generalization. Although the RSB may be locally stable we discuss the possibility that it fails to be the correct saddle point globally. ©2000 The American Physical Society.


A nonthermal quantum mechanical statistical fragmentation model based on tunneling of particles through potential barriers is studied in compact two- and three-dimensional systems. It is shown that this fragmentation dynamics gives rise to several static and dynamic scaling relations. The critical exponents are found and compared with those obtained in classical statistical models of fragmentation of general interest, in particular with thermal fragmentation involving classical processes over potential barriers. Besides its general theoretical interest, the fragmentation dynamics discussed here is complementary to classical fragmentation dynamics of interest in chemical kinetics and can be useful in the study of a number of other dynamic processes such as nuclear fragmentation. ©2000 The American Physical Society.


Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)


A measurement technique for charm-baryon lifetimes from hadroproduction data was presented. The measurement verified the lifetime-analysis procedure on a sample with higher statistical precision. Other effects studied include mass reflections, the presence of a second charm particle, and mismeasurement of charm decays. Monte Carlo simulations were used for a detailed study of systematic effects in the charm data.


Through analysis of the Miyazawa-Jernigan matrix it has been shown that the hydrophobic effect generates the dominant driving force for protein folding. By using both lattice and off-lattice models, it is shown that hydrophobic-type potentials are indeed efficient in driving the chain through nativelike configurations, but they fail to provide sufficient stability to keep the chain in the native state. However, through comparative Monte Carlo simulations, it is shown that hydrophobic potentials and steric constraints are two basic ingredients of the folding process. Specifically, it is shown that suitable pairwise steric constraints introduce strong changes in the configurational activity, whose main consequence is a large increase in the overall stability of the native state; a detailed analysis of the effects of steric constraints on the heat capacity and configurational activity is provided. The present results support the view that the folding problem of globular proteins can be approached as a process in which the mechanism to reach the native conformation and the requirements for globule stability are uncoupled.
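The comparative Monte Carlo simulations described rely on the standard Metropolis acceptance rule; a minimal sketch (in units with k_B = 1; the move set and energy model of the paper are not reproduced here):

```python
import math
import random

def metropolis_accept(delta_e, temperature, rng=random.random):
    """Standard Metropolis criterion (k_B = 1): always accept moves that
    lower the energy, otherwise accept with probability exp(-dE/T)."""
    return delta_e <= 0 or rng() < math.exp(-delta_e / temperature)
```

In a folding simulation, `delta_e` would be the change in contact energy (hydrophobic plus steric terms) produced by a trial chain move.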


We present the results of a search for the flavor-changing neutral-current decay Bs⁰ → μ⁺μ⁻ using a data set with an integrated luminosity of 240 pb⁻¹ of pp̄ collisions at √s = 1.96 TeV collected with the D0 detector in Run II of the Fermilab Tevatron collider. We find the upper limit on the branching fraction to be B(Bs⁰ → μ⁺μ⁻) ≤ 5.0 × 10⁻⁷ at the 95% C.L., assuming no contributions from the decay Bd⁰ → μ⁺μ⁻ in the signal region. This limit is the most stringent upper bound on the Bs⁰ → μ⁺μ⁻ branching fraction to date. © 2005 The American Physical Society.


Purpose - The purpose of this paper is to present designs for an accelerated life test (ALT). Design/methodology/approach - Bayesian methods and Markov chain Monte Carlo (MCMC) simulation methods were used. Findings - The paper proposes a Bayesian method based on MCMC for ALT under the exponentiated Weibull (EW) distribution (for lifetime) and the Arrhenius model (relating the stress variable and parameters). The paper concludes that this is a reasonable alternative to classical statistical methods, since the implementation of the proposed method is simple, does not require advanced computational understanding, and inferences on the parameters can be made easily. Using the predictive density of a future observation, a procedure was developed to plan ALTs and also to verify whether the conformance fraction of the manufacturing process reaches a desired level of quality. This procedure is useful for statistical process control in many industrial applications. Research limitations/implications - The results may be applied by a semiconductor manufacturer. Originality/value - The Exponentiated-Weibull-Arrhenius model has never before been used to plan an ALT. © Emerald Group Publishing Limited.
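The Arrhenius life-stress relation underlying the model implies a standard acceleration factor between stress and use temperatures; a minimal sketch (the activation energy and temperatures below are illustrative inputs, not values from the paper):

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_acceleration_factor(t_use_k, t_stress_k, ea_ev):
    """Arrhenius acceleration factor between stress and use temperatures
    (kelvin), for activation energy ea_ev in eV:
    AF = exp((Ea/k) * (1/T_use - 1/T_stress))."""
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))
```

Dividing stress-test lifetimes by this factor rescales them to use conditions, which is the step an ALT plan like the one above builds on.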


Proton beams in medical applications deal with relatively thick targets such as the human head or trunk. Therefore, relatively small differences in the total proton stopping power given, for example, by the different models provided by GEANT4 can lead to significant disagreements in the final proton energy spectra when integrated along lengthy proton trajectories. This work presents proton energy spectra obtained in GEANT4.8.2 simulations using the ICRU49, Ziegler1985, and Ziegler2000 models for 19.68 MeV protons passing through Al absorbers of various thicknesses. The spectra were compared with experimental data, with TRIM/SRIM2008 and MCNPX2.4.0 simulations, and with the Payne analytical solution of the transport equation in the Fokker-Planck approximation. It is shown that the MCNPX simulations reproduce all the experimental spectra reasonably well. For the relatively thin targets all the methods give practically identical results, but this is not the case for the thick absorbers. It should be noted that all the spectra were measured at proton energies significantly above 2 MeV, i.e., in the so-called Bethe-Bloch region. Therefore the observed disagreements among GEANT4 results simulated with different models are somewhat unexpected. Further studies are necessary for better understanding and definitive conclusions. © 2009 American Institute of Physics.
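The stopping-power models differ, but each yields a range through the same CSDA integral R(E0) = ∫ dE / S(E), which is where small model differences accumulate over thick absorbers; a minimal numeric sketch with a toy power-law stopping power (not tabulated ICRU49/Ziegler data):

```python
import numpy as np

def csda_range(e0, stopping_power, n=2000, e_min=0.1):
    """CSDA range R(E0) = integral of dE / S(E) from e_min to e0 (MeV),
    by the trapezoidal rule; stopping_power maps MeV -> MeV/cm."""
    e = np.linspace(e_min, e0, n)
    y = 1.0 / stopping_power(e)
    de = e[1] - e[0]
    return float(de * (y.sum() - 0.5 * (y[0] + y[-1])))

def toy_stopping_power(e):
    """Illustrative power-law fall-off, *not* tabulated ICRU49 data."""
    return 50.0 * e ** -0.8
```

Swapping in two slightly different S(E) models and comparing the resulting ranges reproduces, in miniature, how model disagreements grow with absorber thickness.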


Proton computed tomography (pCT) deals with relatively thick targets such as the human head or trunk. Thus, the fidelity of pCT as a tool for proton therapy planning depends on the accuracy of the physical formulas used for proton interaction with thick absorbers. Although the actual overall accuracy of the proton stopping power in the Bethe-Bloch domain is about 1%, the analytical calculations and the Monte Carlo simulations with codes like TRIM/SRIM, MCNPX, and GEANT4 do not agree with each other. Attempts to validate the codes against experimental data for thick absorbers face some difficulties: only a few data sets are available, and the existing data sets have been acquired at different initial proton energies and for different absorber materials. In this work we compare the results of our Monte Carlo simulations with existing experimental data in terms of the reduced calibration curve, i.e., the range-energy dependence normalized, on the range scale, by the full projected CSDA range for the given initial proton energy in a given material, taken from the NIST PSTAR database, and, on the final proton energy scale, by the given initial proton energy. This approach is almost energy and material independent. The results of our analysis are important for pCT development because the contradictions observed at arbitrary low initial proton energies can now be easily scaled to typical pCT energies. © 2010 American Institute of Physics.
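The reduced calibration curve described above is a simple pair of normalizations; a minimal sketch (R_CSDA and E0 would come from NIST PSTAR and the experiment; the numbers in the usage note are placeholders):

```python
def reduced_calibration_curve(depths_cm, final_energies_mev, r_csda_cm, e0_mev):
    """Map a measured range-energy data set onto the reduced calibration
    curve: depth scaled by the full projected CSDA range R_CSDA(E0)
    (e.g. from NIST PSTAR) and final energy scaled by the initial E0."""
    reduced_depth = [d / r_csda_cm for d in depths_cm]
    reduced_energy = [e / e0_mev for e in final_energies_mev]
    return reduced_depth, reduced_energy
```

Because both axes are dimensionless after this scaling, data sets taken at different initial energies and in different materials fall onto nearly the same curve, which is what makes the comparison above almost energy and material independent.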