960 results for Monte-Carlo approach
Abstract:
The inhomogeneous Poisson process is a point process that has varying intensity across its domain (usually time or space). For nonparametric Bayesian modeling, the Gaussian process is a useful way to place a prior distribution on this intensity. The combination of a Poisson process and GP is known as a Gaussian Cox process, or doubly-stochastic Poisson process. Likelihood-based inference in these models requires an intractable integral over an infinite-dimensional random function. In this paper we present the first approach to Gaussian Cox processes in which it is possible to perform inference without introducing approximations or finite-dimensional proxy distributions. We call our method the Sigmoidal Gaussian Cox Process, which uses a generative model for Poisson data to enable tractable inference via Markov chain Monte Carlo. We compare our method to competing methods on synthetic data and apply it to several real-world data sets. Copyright 2009.
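Tractable inference in such models typically rests on the fact that an inhomogeneous Poisson process with bounded intensity can be sampled exactly by thinning a homogeneous one. A minimal sketch of that generative step (function names and the example intensity are illustrative, not the authors' full sampler):

```python
import math
import random

def sample_inhomogeneous_poisson(intensity, t_max, lambda_star, rng=random.Random(0)):
    """Sample event times on [0, t_max] by thinning a homogeneous Poisson
    process with dominating rate lambda_star >= intensity(t) for all t."""
    events, t = [], 0.0
    while True:
        t += rng.expovariate(lambda_star)              # next candidate arrival
        if t > t_max:
            return events
        if rng.random() < intensity(t) / lambda_star:  # keep with prob λ(t)/λ*
            events.append(t)

# A sigmoid-modulated intensity, in the spirit of a sigmoidal Cox process;
# it is bounded above by 2.0, so lambda_star = 2.0 is a valid dominating rate:
lam = lambda t: 2.0 / (1.0 + math.exp(-math.sin(t)))
times = sample_inhomogeneous_poisson(lam, 10.0, 2.0)
```

In the Cox-process setting the intensity itself would be a (transformed) Gaussian process draw rather than a fixed function, but the thinning step is the same.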
Abstract:
This work addresses the problem of estimating the optimal value function in a Markov Decision Process from observed state-action pairs. We adopt a Bayesian approach to inference, which allows both the model to be estimated and predictions about actions to be made in a unified framework, providing a principled approach to mimicry of a controller on the basis of observed data. A new Markov chain Monte Carlo (MCMC) sampler is devised for simulation from the posterior distribution over the optimal value function. The sampler includes a parameter-expansion step, which is shown to be essential for its good convergence properties. As an illustration, the method is applied to learning a human controller.
Abstract:
Approximate Bayesian computation (ABC) has become a popular technique to facilitate Bayesian inference from complex models. In this article we present an ABC approximation designed to perform biased filtering for a Hidden Markov Model when the likelihood function is intractable. We use a sequential Monte Carlo (SMC) algorithm to both fit and sample from our ABC approximation of the target probability density. This approach is shown to, empirically, be more accurate w.r.t. the original filter than competing methods. The theoretical bias of our method is investigated; it is shown that the bias goes to zero at the expense of increased computational effort. Our approach is illustrated on a constrained sequential lasso for portfolio allocation to 15 constituents of the FTSE 100 share index.
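The ABC filtering idea can be sketched in a few lines: in a particle filter, replace the intractable likelihood weight with an indicator that pseudo-data simulated from each particle falls within a tolerance eps of the observation. A toy version on a linear-Gaussian model (all names and the model are illustrative, not the paper's algorithm):

```python
import random

def abc_particle_filter(y_obs, n_particles, transition, simulate_obs, eps,
                        rng=random.Random(1)):
    """ABC particle filter: weight particles by whether data simulated from
    them lands within eps of the observation, instead of evaluating an
    (intractable) likelihood; then resample."""
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in y_obs:
        particles = [transition(x, rng) for x in particles]      # propagate
        weights = [1.0 if abs(simulate_obs(x, rng) - y) < eps else 1e-12
                   for x in particles]                           # ABC weights
        total = sum(weights)
        probs = [w / total for w in weights]
        particles = rng.choices(particles, weights=probs, k=n_particles)
        means.append(sum(particles) / n_particles)               # filter mean
    return means

# Toy AR(1) state with noisy observations (exact filter exists here,
# which is what makes it useful as a sanity check):
trans = lambda x, rng: 0.9 * x + rng.gauss(0.0, 0.5)
obs = lambda x, rng: x + rng.gauss(0.0, 0.5)
est = abc_particle_filter([0.2, 0.5, 0.1], 500, trans, obs, eps=1.0)
```

Shrinking eps reduces the ABC bias, at the cost of more rejected pseudo-data and hence more computation, matching the trade-off the abstract describes.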
Abstract:
A summary is presented of research conducted on beach erosion associated with extreme storms and sea level rise. These results were developed by the author and graduate students under sponsorship of the University of Delaware Sea Grant Program. Various shoreline response problems of engineering interest are examined. The basis for the approach is a monotonic equilibrium profile of the form h = Ax^(2/3), in which h is the water depth at a distance x from the shoreline and A is a scale parameter depending primarily on sediment characteristics and secondarily on wave characteristics. This form is shown to be consistent with uniform wave energy dissipation per unit volume. The dependency of A on sediment size is quantified through laboratory and field data. Quasi-static beach response is examined to represent the effect of sea level rise. Cases considered include natural and seawalled profiles. To represent response to storms of realistic durations, a model is proposed in which the offshore transport is proportional to the "excess" energy dissipation per unit volume. The single rate constant in this model was evaluated based on large-scale wave tank tests and confirmed with Hurricane Eloise pre- and post-storm surveys. It is shown that most hurricanes cause only 10% to 25% of the erosion potential associated with the peak storm tide and wave conditions. Additional applications include profile response employing a fairly realistic breaking model in which longshore bars are formed, and long-term (500-year) Monte Carlo simulation including the contributions due to sea level rise and random storm occurrences.
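The equilibrium profile h = Ax^(2/3) is simple enough to evaluate directly; a one-function sketch (the value of A below is illustrative, not taken from the report):

```python
def equilibrium_depth(x, A=0.1):
    """Equilibrium beach profile h = A * x**(2/3): water depth h (m) at
    distance x (m) offshore; the scale parameter A (m**(1/3)) depends
    mainly on sediment size."""
    return A * x ** (2.0 / 3.0)

# With an assumed A = 0.1 m^(1/3), the depth 100 m offshore:
h = equilibrium_depth(100.0)   # 0.1 * 100**(2/3) ≈ 2.15 m
```

The 2/3 exponent is what encodes uniform wave-energy dissipation per unit volume: the profile steepens near the shoreline and flattens offshore.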
Abstract:
Transmission investments are currently needed to meet an increasing electricity demand, to address security of supply concerns, and to reach carbon-emissions targets. A key issue when assessing the benefits from an expanded grid concerns the valuation of the uncertain cash flows that result from the expansion. We propose a valuation model that accommodates both physical and economic uncertainties following the Real Options approach. It combines optimization techniques with Monte Carlo simulation. We illustrate the use of our model in a simplified, two-node grid and assess the decision of whether or not to invest in a particular upgrade. The generation mix includes coal- and natural gas-fired stations that operate under carbon constraints. The underlying parameters are estimated from observed market data.
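The combination of simulation and optionality can be illustrated with a bare-bones Monte Carlo valuation in which the payoff of investing is floored at zero, reflecting the option not to invest. All numbers and names here are illustrative assumptions, not the paper's model:

```python
import random

def mc_option_value(upgrade_cost, cash_flow_sim, discount, n_paths,
                    rng=random.Random(7)):
    """Real-options flavoured Monte Carlo valuation: average the discounted
    payoff of investing across simulated cash-flow paths, flooring each
    path's payoff at zero (the option to walk away)."""
    total = 0.0
    for _ in range(n_paths):
        pv = sum(cf / (1.0 + discount) ** t
                 for t, cf in enumerate(cash_flow_sim(rng), start=1))
        total += max(pv - upgrade_cost, 0.0)
    return total / n_paths

# Hypothetical upgrade: 10 years of uncertain congestion-relief cash flows
# (mean 12, sd 4, arbitrary units) against an upfront cost of 80 at a 7% rate:
sim = lambda rng: [rng.gauss(12.0, 4.0) for _ in range(10)]
value = mc_option_value(80.0, sim, 0.07, 20000)
```

A full model would add the physical layer (dispatch optimization per path) inside the simulator, which is where the paper's optimization-plus-simulation combination comes in.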
Abstract:
Our recent studies on the kinetic behavior of gas flows are reviewed in this paper. These flows arise in a wide range of settings but share a common feature: the flow Knudsen number is larger than 0.01. Thus kinetic approaches such as the direct simulation Monte Carlo method are required for their description. In the past few years, we have studied several micro/nano-scale flows by developing novel particle simulation approaches, and have investigated flows in low-pressure chambers and at high altitude. In addition, the microscopic behaviors of a couple of classical flow problems were analyzed, which demonstrates the potential of kinetic approaches to reveal the microscopic mechanisms of gas flows.
Abstract:
G-protein coupled receptors (GPCRs) form a large family of proteins and are very important drug targets. They are membrane proteins, which makes computational prediction of their structure challenging. Homology modeling is further complicated by the low sequence similarity across the GPCR superfamily.
In this dissertation, we analyze the conserved inter-helical contacts of recently solved crystal structures, and we develop a unified sequence-structural alignment of the GPCR superfamily. We use this method to align 817 human GPCRs, 399 of which are nonolfactory. This alignment can be used to generate high quality homology models for the 817 GPCRs.
To refine the provided GPCR homology models we developed the Trihelix sampling method. We use a multi-scale approach to simplify the problem by treating the transmembrane helices as rigid bodies. In contrast to Monte Carlo structure prediction methods, the Trihelix method performs a complete local sampling over discretized coordinates for the transmembrane helices. We validate the method on existing structures and apply it to predict the structure of the lactate receptor, HCAR1. For this receptor, we also build extracellular loops by taking into account constraints from three disulfide bonds. Docking of lactate and 3,5-dihydroxybenzoic acid shows likely involvement of three Arg residues on different transmembrane helices in binding a single ligand molecule.
Protein structure prediction relies on accurate force fields. We next present an effort to improve the quality of charge assignment for large atomic models. In particular, we introduce the formalism of the polarizable charge equilibration scheme (PQEq) and describe its implementation in the molecular simulation package LAMMPS. PQEq allows fast on-the-fly charge assignment, even for reactive force fields.
Abstract:
The topological phases of matter have been a major part of condensed matter physics research since the discovery of the quantum Hall effect in the 1980s. Recently, much of this research has focused on the study of systems of free fermions, such as the integer quantum Hall effect, quantum spin Hall effect, and topological insulator. Though these free fermion systems can play host to a variety of interesting phenomena, the physics of interacting topological phases is even richer. Unfortunately, there is a shortage of theoretical tools that can be used to approach interacting problems. In this thesis I will discuss progress in using two different numerical techniques to study topological phases.
Recently much research in topological phases has focused on phases made up of bosons. Unlike fermions, free bosons form a condensate and so interactions are vital if the bosons are to realize a topological phase. Since these phases are difficult to study, much of our understanding comes from exactly solvable models, such as Kitaev's toric code, as well as Levin-Wen and Walker-Wang models. We may want to study systems for which such exactly solvable models are not available. In this thesis I present a series of models which are not solvable exactly, but which can be studied in sign-free Monte Carlo simulations. The models work by binding charges to point topological defects. They can be used to realize bosonic interacting versions of the quantum Hall effect in 2D and topological insulator in 3D. Effective field theories of "integer" (non-fractionalized) versions of these phases were available in the literature, but our models also allow for the construction of fractional phases. We can measure a number of properties of the bulk and surface of these phases.
Few interacting topological phases have been realized experimentally, but there is one very important exception: the fractional quantum Hall effect (FQHE). Though the fractional quantum Hall effect was discovered over 30 years ago, it can still produce novel phenomena. Of much recent interest is the existence of non-Abelian anyons in FQHE systems. Though it is possible to construct wave functions that realize such particles, whether these wave functions are the ground state is a difficult quantitative question that must be answered numerically. In this thesis I describe progress using a density-matrix renormalization group algorithm to study a bilayer system thought to host non-Abelian anyons. We find phase diagrams in terms of experimentally relevant parameters, and also find evidence for a non-Abelian phase known as the "interlayer Pfaffian".
Abstract:
A study is made of the accuracy of electronic digital computer calculations of ground displacement and response spectra from strong-motion earthquake accelerograms. This involves an investigation of methods of the preparatory reduction of accelerograms into a form useful for the digital computation and of the accuracy of subsequent digital calculations. Various checks are made for both the ground displacement and response spectra results, and it is concluded that the main errors are those involved in digitizing the original record. Differences resulting from various investigators digitizing the same experimental record may become as large as 100% of the maximum computed ground displacements. The spread of the results of ground displacement calculations is greater than that of the response spectra calculations. Standardized methods of adjustment and calculation are recommended, to minimize such errors.
Studies are made of the spread of response spectral values about their mean. The distribution is investigated experimentally by Monte Carlo techniques using an electric analog system with white noise excitation, and histograms are presented indicating the dependence of the distribution on the damping and period of the structure. Approximate distributions are obtained analytically by confirming and extending existing results with accurate digital computer calculations. A comparison of the experimental and analytical approaches indicates good agreement for low damping values where the approximations are valid. A family of distribution curves to be used in conjunction with existing average spectra is presented. The combination of analog and digital computations used with Monte Carlo techniques is a promising approach to the statistical problems of earthquake engineering.
Methods of analysis of very small earthquake ground motion records obtained simultaneously at different sites are discussed. The advantages of Fourier spectrum analysis for certain types of studies and methods of calculation of Fourier spectra are presented. The digitizing and analysis of several earthquake records is described and checks are made of the dependence of results on digitizing procedure, earthquake duration and integration step length. Possible dangers of a direct ratio comparison of Fourier spectra curves are pointed out and the necessity for some type of smoothing procedure before comparison is established. A standard method of analysis for the study of comparative ground motion at different sites is recommended.
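The analog Monte Carlo experiment described above has a straightforward digital analogue: integrate a damped single-degree-of-freedom oscillator under random forcing many times and study the spread of the peak responses. A toy sketch (integration scheme and parameters are illustrative, not the study's setup):

```python
import math
import random

def peak_response(omega, zeta, dt=0.01, n_steps=2000, rng=None):
    """Peak displacement of a damped SDOF oscillator
    x'' + 2*zeta*omega*x' + omega**2 * x = w(t),
    driven by discrete white noise and integrated with a simple
    semi-implicit Euler scheme."""
    rng = rng or random.Random(0)
    x, v, peak = 0.0, 0.0, 0.0
    for _ in range(n_steps):
        a = -2.0 * zeta * omega * v - omega ** 2 * x + rng.gauss(0.0, 1.0)
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

# Monte Carlo spread of spectral values for a 1 s period, 5%-damped oscillator:
rng = random.Random(42)
peaks = [peak_response(2.0 * math.pi, 0.05, rng=rng) for _ in range(200)]
mean_peak = sum(peaks) / len(peaks)
```

A histogram of `peaks` is the digital counterpart of the histograms the study obtained from the electric analog system, showing how the distribution of spectral values depends on damping and period.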
Abstract:
This dissertation presents results of Monte Carlo simulations of X-ray fluorescence (XRF), using the GEANT4 toolkit, for measuring the thickness of metallic coatings (Ni and Zn) on a metallic substrate (Fe). Simulations were run for two coating thicknesses of each coating metal (5 μm and 10 μm), with steps of 0.1 μm and 0.001 μm and 10^6 histories. The coating-thickness calculation assumed a monoenergetic X-ray beam, analyzed the transmission of the K-alpha energy only, and used a geometry compatible with a real measurement system (ARTAX-200). The results demonstrated the efficiency of the simulation methodology and of the coating-thickness calculation, which will enable future calculations, including for metallic multilayer coatings on a metallic substrate.
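The transmission-based thickness calculation reduces to Beer-Lambert attenuation of the substrate K-alpha line through the coating. A sketch with illustrative numbers (the mass attenuation coefficient, density, and intensity ratio below are assumptions for demonstration, not values from the dissertation):

```python
import math

def coating_thickness_um(I_ratio, mu_rho_cm2_g, density_g_cm3):
    """Coating thickness from the attenuated substrate K-alpha intensity,
    via Beer-Lambert: I/I0 = exp(-(mu/rho) * rho * t), solved for t."""
    t_cm = -math.log(I_ratio) / (mu_rho_cm2_g * density_g_cm3)
    return t_cm * 1e4  # cm -> micrometres

# Hypothetical Ni coating: assumed mass attenuation coefficient 90 cm^2/g
# at the Fe K-alpha energy, density 8.9 g/cm^3, measured transmission 45%:
t = coating_thickness_um(0.45, 90.0, 8.9)
```

In the Monte Carlo study the simulated spectra play the role of `I_ratio`, and stepping the simulated thickness in 0.1 μm or 0.001 μm increments calibrates this inversion.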
Abstract:
A generalized Bayesian population dynamics model was developed for the analysis of historical mark-recapture studies. The Bayesian approach builds upon existing maximum likelihood methods and is useful when substantial uncertainties exist in the data or little information is available about auxiliary parameters such as tag loss and reporting rates. Movement rates are obtained through Markov chain Monte Carlo (MCMC) simulation and are suitable for use as input in subsequent stock assessment analyses. The mark-recapture model was applied to English sole (Parophrys vetulus) off the west coast of the United States and Canada, and migration rates were estimated to be 2% per month to the north and 4% per month to the south. These posterior parameter distributions and the Bayesian framework for comparing hypotheses can guide fishery scientists in structuring the spatial and temporal complexity of future analyses of this kind. This approach could easily be generalized for application to other species and more data-rich fishery analyses.
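The MCMC machinery behind such estimates can be sketched with a generic random-walk Metropolis sampler; the toy posterior below stands in for a single monthly movement-rate parameter and is purely illustrative, not the paper's model:

```python
import math
import random

def metropolis(log_post, x0, n_iter, step, rng=random.Random(3)):
    """Random-walk Metropolis sampler: propose a Gaussian step, accept with
    probability min(1, post(new)/post(old)); a basic MCMC building block."""
    x, samples = x0, []
    lp = log_post(x)
    for _ in range(n_iter):
        x_new = x + rng.gauss(0.0, step)
        lp_new = log_post(x_new)
        if math.log(rng.random()) < lp_new - lp:   # accept/reject
            x, lp = x_new, lp_new
        samples.append(x)
    return samples

# Toy unnormalized log-posterior for a movement rate p on (0, 1),
# Beta(3, 97)-shaped, i.e. concentrated around a few percent per month:
log_post = lambda p: (2 * math.log(p) + 96 * math.log(1 - p)) \
    if 0.0 < p < 1.0 else -math.inf
draws = metropolis(log_post, 0.05, 5000, 0.02)
post_mean = sum(draws[1000:]) / len(draws[1000:])  # discard burn-in
```

The real model replaces this toy posterior with the full mark-recapture likelihood over movement, tag-loss, and reporting-rate parameters, but the sampling loop is the same in structure.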
Abstract:
Molecular markers have been demonstrated to be useful for the estimation of stock mixture proportions where the origin of individuals is determined from baseline samples. Bayesian statistical methods are widely recognized as providing a preferable strategy for such analyses. In general, Bayesian estimation is based on standard latent class models using data augmentation through Markov chain Monte Carlo techniques. In this study, we introduce a novel approach based on recent developments in the estimation of genetic population structure. Our strategy combines analytical integration with stochastic optimization to identify stock mixtures. An important enhancement over previous methods is the possibility of appropriately handling data where only partial baseline sample information is available. We address the potential use of nonmolecular, auxiliary biological information in our Bayesian model.
Abstract:
Billfishes are a component of offshore ecosystems; thus it is important to quantify the impact of the tuna fishery on these species in the world's oceans. The aim of this study was to assess the bycatch of billfishes generated by the tropical tuna purse-seine fishery in the eastern Atlantic Ocean. Information on bycatch was collected by observers at sea during the European Union Bigeye Program. With a total of 62 observer trips, conducted on Spanish and French vessels between June 1997 and May 1999, this project is the biggest observer program ever carried out in the European tuna purse-seine fishery. This study showed that billfish bycatch by the purse seiners is very low (less than 0.021% of the total tuna catches and less than 10% of the total billfish catches currently reported). A Monte Carlo simulation was performed to account for uncertainties in the fishing strategies of purse seiners operating in this ocean. One finding of this study was that the temporary moratorium on fishing with FADs (fish aggregating devices), adopted by the European purse-seine fishery in the eastern Atlantic Ocean, produced a decrease in incidental catches of marlins from 600–700 metric tons (t) to less than 300 t. In contrast, this trend was reversed for sailfishes, for which the bycatch increased from 25 t to 45 t. The difficulty of defining indices that express the conservation status of marine fishes and gauge key ecosystem parameters, and the need to promote an ecosystem approach to large-pelagic-resource management that takes into account biological and socioeconomic criteria, are briefly discussed.
Abstract:
Many data are naturally modeled by an unobserved hierarchical structure. In this paper we propose a flexible nonparametric prior over unknown data hierarchies. The approach uses nested stick-breaking processes to allow for trees of unbounded width and depth, where data can live at any node and are infinitely exchangeable. One can view our model as providing infinite mixtures where the components have a dependency structure corresponding to an evolutionary diffusion down a tree. By using a stick-breaking approach, we can apply Markov chain Monte Carlo methods based on slice sampling to perform Bayesian inference and simulate from the posterior distribution on trees. We apply our method to hierarchical clustering of images and topic modeling of text data.
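The stick-breaking construction at the heart of this prior is easy to sketch: break a unit-length stick into infinitely many pieces, where each piece's length is a Beta-distributed fraction of what remains. In the nested version, each tree node breaks its own stick over its children. A truncated, illustrative version of the basic construction:

```python
import random

def stick_breaking(alpha, n_sticks, rng=random.Random(0)):
    """Truncated stick-breaking: weights pi_k = v_k * prod_{j<k} (1 - v_j),
    with v_k ~ Beta(1, alpha). As n_sticks grows, the weights approach a
    draw from a Dirichlet process's weight distribution."""
    weights, remaining = [], 1.0
    for _ in range(n_sticks):
        v = rng.betavariate(1.0, alpha)
        weights.append(remaining * v)   # length of the piece broken off
        remaining *= 1.0 - v            # stick left for later pieces
    return weights

pi = stick_breaking(alpha=2.0, n_sticks=20)
```

Smaller alpha concentrates mass on the first few components; the slice-sampling MCMC mentioned in the abstract exploits the fact that only finitely many of these weights matter for any given slice level.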