980 results for Density Estimation, Gibbs Sampling, Markov Chain Monte Carlo, Markov Random Field, Metropolis-Hastings Algorithm, Posterior Simulation


Relevance: 100.00%

Abstract:

The author studies the error and complexity of the discrete random walk Monte Carlo technique for radiosity, using both the shooting and gathering methods, and shows that the shooting method exhibits lower complexity than the gathering one; under some constraints it has linear complexity, improving on a previous result that pointed to an O(n log n) complexity. Three unbiased estimators are given and compared for each method, with closed forms and bounds obtained for their variances, and the expected value of the mean square error (MSE) is also bounded. Some of the results obtained are also shown
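As a rough illustration of the machinery involved (not the paper's shooting or gathering estimators), the sketch below applies an unbiased random-walk "collision" estimator to a generic linear system x = b + Mx, the algebraic form underlying random-walk radiosity (B = E + R F B); the matrix and emission vector are invented for the example.

```python
# Minimal sketch: unbiased random-walk collision estimator for x = b + M x,
# with M >= 0 and row sums < 1. The 3-state "scene" below is made up.
import numpy as np

rng = np.random.default_rng(0)

def random_walk_solve(M, b, i, n_walks=20000):
    """Estimate x_i for x = b + M x via absorbing random walks."""
    n = len(b)
    row_sum = M.sum(axis=1)              # survival probability at each state
    total = 0.0
    for _ in range(n_walks):
        state, score = i, 0.0
        while True:
            score += b[state]            # tally the source term at every visit
            if rng.random() >= row_sum[state]:
                break                    # absorbed: walk terminates
            # continue: pick next state with prob M[state, k] / row_sum[state]
            state = rng.choice(n, p=M[state] / row_sum[state])
        total += score
    return total / n_walks

M = np.array([[0.0, 0.3, 0.2],
              [0.4, 0.0, 0.1],
              [0.2, 0.2, 0.0]])
b = np.array([1.0, 0.0, 0.5])

est = random_walk_solve(M, b, i=0)
exact = np.linalg.solve(np.eye(3) - M, b)[0]
print(f"MC estimate {est:.4f}  vs exact {exact:.4f}")
```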

Relevance: 100.00%

Abstract:

Finding the smallest eigenvalue of a given square matrix A of order n is a computationally very intensive problem. The most popular method for this problem is the inverse power method, which uses an LU decomposition and forward and backward solves of the factored system at every iteration step. An alternative is the resolvent Monte Carlo method, which represents the resolvent matrix [I - qA]^{-m} as a series and then performs Monte Carlo iterations (random walks) on the elements of the matrix. This leads to great savings in computation, but the method has many restrictions and very slow convergence. In this paper we propose a method that includes a fast Monte Carlo procedure for finding the inverse matrix, a refinement procedure to improve the approximation of the inverse if necessary, and Monte Carlo power iterations to compute the smallest eigenvalue. We provide not only theoretical estimates of accuracy and convergence but also results from numerical tests performed on a number of test matrices.
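A deterministic toy version of that pipeline, with numpy stand-ins for the Monte Carlo pieces: an approximate inverse, Newton-Schulz refinement of it, then power iterations on the refined inverse, whose dominant eigenvalue is the reciprocal of the smallest eigenvalue of A. The test matrix is made up.

```python
# Sketch only: the Monte Carlo inverse is replaced by a perturbed numpy inverse.
import numpy as np

rng = np.random.default_rng(1)

def newton_schulz(A, X, iters=10):
    """Refine an approximate inverse X of A: X <- X (2I - A X)."""
    I = np.eye(A.shape[0])
    for _ in range(iters):
        X = X @ (2 * I - A @ X)
    return X

def smallest_eigenvalue(A, power_iters=200):
    X = np.linalg.inv(A) + 1e-2 * rng.standard_normal(A.shape)  # crude approx inverse
    X = newton_schulz(A, X)                                     # refinement step
    v = rng.standard_normal(A.shape[0])
    for _ in range(power_iters):                                # power iteration on A^{-1}
        v = X @ v
        v /= np.linalg.norm(v)
    mu = v @ (X @ v)          # Rayleigh quotient: dominant eigenvalue of A^{-1}
    return 1.0 / mu           # reciprocal gives the smallest eigenvalue of A

A = np.diag([4.0, 3.0, 2.0, 0.5]) + 0.1 * np.ones((4, 4))  # toy symmetric matrix
print(smallest_eigenvalue(A), np.linalg.eigvalsh(A).min())
```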

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Abstract:

Camera traps are used to estimate densities or abundances using capture-recapture and, more recently, random encounter models (REMs). We deploy REMs to describe an invasive-native species replacement process, and to demonstrate their wider application beyond abundance estimation. The Irish hare Lepus timidus hibernicus is a high-priority endemic of conservation concern. It is threatened by an expanding population of non-native European hares L. europaeus, an invasive species of global importance. Camera traps were deployed in thirteen 1 km squares, wherein the ratio of invader to native densities was corroborated by night-driven line transect distance sampling throughout the study area of 1,652 km². Spatial patterns of invasive and native densities between the invader's core and peripheral ranges, and native allopatry, were comparable between methods. Native densities in the peripheral range were comparable to those in native allopatry using REM, or marginally depressed using distance sampling. Numbers of the invader were substantially higher than those of the native in the core range, irrespective of method, with a 5:1 invader-to-native ratio indicating species replacement. We also describe a post hoc optimization protocol for REM which will inform subsequent (re-)surveys, allowing survey effort (camera hours) to be reduced by up to 57% without compromising the width of confidence intervals associated with density estimates. This approach will form the basis of a more cost-effective means of surveillance and monitoring for both the endemic and invasive species. The European hare undoubtedly represents a significant threat to the endemic Irish hare.
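For readers unfamiliar with REM, the snippet below computes the standard REM point estimate of density from trap rate, day range and detection-zone geometry (Rowcliffe et al. 2008); the survey numbers are invented, not taken from this study.

```python
# D = (y/t) * pi / (v * r * (2 + theta)); all parameter values are illustrative.
import math

def rem_density(photos, camera_days, day_range_km, radius_km, angle_rad):
    """Density (animals per km^2) from trap rate and detection-zone geometry."""
    trap_rate = photos / camera_days
    return trap_rate * math.pi / (day_range_km * radius_km * (2.0 + angle_rad))

# hypothetical survey: 120 detections over 900 camera-days,
# day range 1.5 km/day, detection radius 10 m, detection arc 40 degrees
print(rem_density(photos=120, camera_days=900,
                  day_range_km=1.5, radius_km=0.010,
                  angle_rad=math.radians(40)))
```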

Relevance: 100.00%

Abstract:

Activity recognition is an important issue in building intelligent monitoring systems. We address the recognition of multilevel activities in this paper via a conditional Markov random field (MRF), known as the dynamic conditional random field (DCRF). Parameter estimation in general MRFs using maximum likelihood is known to be computationally challenging (except in extreme cases), so we propose an efficient boosting-based algorithm, AdaBoost.MRF, for this task. Distinct from most existing work, our algorithm can handle hidden variables (missing labels) and is particularly attractive for smart-house domains where reliable labels are often sparsely observed. Furthermore, our method works exclusively on trees and is thus guaranteed to converge. We apply the AdaBoost.MRF algorithm to a home video surveillance application and demonstrate its efficacy.
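The sketch below is not AdaBoost.MRF itself; it is only the plain discrete AdaBoost weight update (with axis-aligned threshold stumps on synthetic data) that boosting-based estimators of this kind build on.

```python
# Generic AdaBoost reminder on made-up 2D data; not the authors' algorithm.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)            # synthetic labels

def best_stump(X, y, w):
    """Weighted-error-minimizing axis-aligned threshold stump."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] > thr, sign, -sign)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best

w = np.full(len(y), 1.0 / len(y))                           # uniform sample weights
F = np.zeros(len(y))                                        # additive ensemble score
for t in range(20):
    err, j, thr, sign = best_stump(X, y, w)
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))       # stump vote weight
    pred = np.where(X[:, j] > thr, sign, -sign)
    w *= np.exp(-alpha * y * pred)                          # reweight: focus on mistakes
    w /= w.sum()
    F += alpha * pred

print("training accuracy:", np.mean(np.sign(F) == y))
```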

Relevance: 100.00%

Abstract:

Discrete Markov random field models provide a natural framework for representing images or spatial datasets. They model the spatial association present while providing a convenient Markovian dependency structure and strong edge-preservation properties. However, parameter estimation for discrete Markov random field models is difficult due to the complex form of the associated normalizing constant for the likelihood function. For large lattices, the reduced dependence approximation to the normalizing constant is based on the concept of performing computationally efficient and feasible forward recursions on smaller sublattices which are then suitably combined to estimate the constant for the whole lattice. We present an efficient computational extension of the forward recursion approach for the autologistic model to lattices that have an irregularly shaped boundary and which may contain regions with no data; these lattices are typical in applications. Consequently, we also extend the reduced dependence approximation to these scenarios enabling us to implement a practical and efficient non-simulation based approach for spatial data analysis within the variational Bayesian framework. The methodology is illustrated through application to simulated data and example images. The supplemental materials include our C++ source code for computing the approximate normalizing constant and simulation studies.
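A minimal sketch of the exact forward recursion on a small regular lattice, which is the building block that the reduced dependence approximation stitches together across sublattices; the lattice size and parameter values below are arbitrary.

```python
# Exact transfer-matrix-style forward recursion for the autologistic
# normalizing constant on an n_rows x n_cols lattice (feasible for small n_cols).
import itertools
import numpy as np

def autologistic_logZ(n_rows, n_cols, alpha, beta):
    """log Z for p(x) prop. to exp(alpha*sum x_i + beta*sum_{i~j} x_i x_j),
       x_i in {0,1}, 4-neighbour lattice."""
    rows = list(itertools.product((0, 1), repeat=n_cols))    # all row configurations

    def within(r):   # singleton terms plus horizontal interactions inside one row
        return alpha * sum(r) + beta * sum(r[j] * r[j + 1] for j in range(n_cols - 1))

    def between(r, s):  # vertical interactions between two adjacent rows
        return beta * sum(rj * sj for rj, sj in zip(r, s))

    # forward recursion over rows, carried out in log space for stability
    log_f = np.array([within(r) for r in rows], dtype=float)
    for _ in range(1, n_rows):
        new = np.empty_like(log_f)
        for k, r in enumerate(rows):
            terms = log_f + np.array([between(s, r) for s in rows])
            new[k] = within(r) + np.logaddexp.reduce(terms)
        log_f = new
    return np.logaddexp.reduce(log_f)

print(autologistic_logZ(n_rows=6, n_cols=5, alpha=-0.2, beta=0.4))
```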

Relevance: 100.00%

Abstract:

Some machine learning methods do not exploit contextual information in the process of discovering, describing and recognizing patterns, yet spatially or temporally neighboring samples are likely to exhibit the same behavior. Here, we propose an approach which unifies a supervised learning algorithm, namely the Optimum-Path Forest classifier, with a Markov random field in order to build a prior model encoding a spatial smoothness assumption, thereby taking contextual information into account for classification purposes. We show its robustness for brain tissue classification on images from the well-known IBSR dataset.
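The following sketch conveys the contextual-smoothing idea in a generic form (it is not Optimum-Path Forest): per-pixel class probabilities from any classifier are combined with a Potts-style MRF prior and relaxed with ICM updates; all inputs are simulated.

```python
# Generic MRF label smoothing with ICM; classifier outputs are random stand-ins.
import numpy as np

rng = np.random.default_rng(3)
H, W, K = 32, 32, 3                              # image size and number of classes
probs = rng.dirichlet(np.ones(K), size=(H, W))   # stand-in classifier outputs
labels = probs.argmax(axis=2)                    # initial (non-contextual) labeling
unary = -np.log(probs + 1e-12)                   # data cost per pixel and class
beta = 1.0                                       # strength of the smoothness prior

def icm_sweep(labels):
    changed = 0
    for i in range(H):
        for j in range(W):
            costs = unary[i, j].copy()
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):   # 4-neighbourhood
                ni, nj = i + di, j + dj
                if 0 <= ni < H and 0 <= nj < W:
                    costs += beta * (np.arange(K) != labels[ni, nj])  # Potts penalty
            new = costs.argmin()
            changed += int(new != labels[i, j])
            labels[i, j] = new
    return changed

for _ in range(20):                              # sweep until labels stop changing
    if icm_sweep(labels) == 0:
        break
print("smoothed label counts:", np.bincount(labels.ravel(), minlength=K))
```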

Relevance: 100.00%

Abstract:

The present paper has two goals. The first is to present a natural example of a new class of random fields, the variable neighborhood random fields; the example we consider is a partially observed nearest-neighbor binary Markov random field. The second is to establish sufficient conditions ensuring that the variable neighborhoods are almost surely finite. We discuss the relationship between the almost sure finiteness of the interaction neighborhoods and the presence or absence of phase transition in the underlying Markov random field. In the case where the underlying random field has no phase transition, we show that the finiteness of neighborhoods depends on a specific relation between the noise level and the minimum values of the one-point specification of the Markov random field. The case in which there is a phase transition is addressed in the framework of the ferromagnetic Ising model, where we prove that the existence of infinite interaction neighborhoods depends on the phase.
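For concreteness, here is a small Gibbs sampler for the ferromagnetic Ising model that underlies the phase-transition discussion above; it illustrates the base Markov random field only, not the variable-neighborhood construction itself.

```python
# Gibbs sampler for the 2D Ising model (spins in {-1,+1}, free boundary).
import numpy as np

rng = np.random.default_rng(4)

def ising_gibbs(n=32, beta=0.4, sweeps=200):
    spins = rng.choice([-1, 1], size=(n, n))
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                # sum over the nearest-neighbour spins
                s = 0
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < n and 0 <= nj < n:
                        s += spins[ni, nj]
                # conditional P(spin = +1 | neighbours) for the Ising model
                p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))
                spins[i, j] = 1 if rng.random() < p_plus else -1
    return spins

spins = ising_gibbs()
print("magnetization per site:", spins.mean())
```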

Relevance: 100.00%

Abstract:

Many methodologies dealing with prediction or simulation of soft tissue deformations on medical image data require preprocessing of the data in order to produce a different shape representation that complies with standard methodologies, such as mass-spring networks or the finite element method (FEM). On the other hand, methodologies working directly in the image space normally do not take into account the mechanical behavior of tissues and tend to lack the physical foundations driving soft tissue deformations. This chapter presents a method to simulate soft tissue deformations based on coupled concepts from image analysis and mechanics theory. The proposed methodology is based on a robust stochastic approach that takes into account material properties retrieved directly from the image, concepts from continuum mechanics, and FEM. The optimization framework is solved within a hierarchical Markov random field (HMRF), which is implemented on the graphics processing unit (GPU).
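A deliberately reduced illustration of the "material properties from the image" ingredient, nothing like the full HMRF/GPU pipeline: a 1D chain of linear elastic finite elements whose stiffness is a hypothetical function of voxel intensities, loaded at one end.

```python
# 1D bar-element FEM with image-derived stiffness; all values are made up.
import numpy as np

intensities = np.array([80.0, 120.0, 200.0, 150.0, 90.0])  # made-up voxel values
stiffness = 0.05 * intensities            # hypothetical intensity-to-stiffness map
n_nodes = len(stiffness) + 1

K = np.zeros((n_nodes, n_nodes))          # global stiffness matrix
for e, k in enumerate(stiffness):         # assemble 2-node bar elements
    K[e:e + 2, e:e + 2] += k * np.array([[1.0, -1.0],
                                         [-1.0, 1.0]])

f = np.zeros(n_nodes)
f[-1] = 1.0                               # unit traction at the free end
# fix node 0 (zero displacement) and solve for the remaining displacements
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])
print("nodal displacements:", np.round(u, 4))
```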

Relevance: 100.00%

Abstract:

Even though a large amount of evidence suggests that the PP2A serine/threonine protein phosphatase acts as a tumour suppressor, the genomics data supporting this claim are limited. We fit a sparse binary Markov random field, with each sample's total mutational frequency as an additional covariate, to model the dependencies between the mutations occurring in the PP2A-encoding genes. We utilize data from recent large-scale cancer genomics studies in which the whole genome from a human tumour biopsy has been analysed. Our results show a complex network of interactions between the occurrence of mutations in the twenty examined genes. According to our analysis, the mutations occurring in the genes PPP2R1A, PPP2R3A, and PPP2R2B are identified as the key mutations; these genes form the core of the network of conditional dependency between the mutations in the investigated twenty genes. Additionally, we note that mutations occurring in PPP2R4 appear to be more influential in samples with a higher total number of mutations. Mutations occurring in the set of genes suggested by our results have been shown to contribute to the transformation of human cells. We conclude that our evidence further supports the claim that PP2A acts as a tumour suppressor and that restoring PP2A activity is an appealing therapeutic strategy.
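A rough stand-in for this kind of fit (not the authors' exact model): node-wise L1-penalized logistic regressions estimating a sparse conditional-dependency network among binary mutation indicators, with each sample's total mutation count as an extra covariate; the data below are simulated.

```python
# Sparse dependency network via node-wise L1 logistic regression; fake data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n_samples, n_genes = 300, 8
X = (rng.random((n_samples, n_genes)) < 0.15).astype(int)   # fake mutation matrix
total = X.sum(axis=1, keepdims=True)                        # per-sample mutation load

edges = np.zeros((n_genes, n_genes))
for g in range(n_genes):
    y = X[:, g]
    if y.min() == y.max():                                  # skip degenerate columns
        continue
    others = np.delete(X, g, axis=1)
    feats = np.hstack([others, total])                      # other genes + covariate
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.3)
    model.fit(feats, y)
    coef = model.coef_.ravel()[:n_genes - 1]                # drop the covariate term
    edges[g, np.delete(np.arange(n_genes), g)] = coef

# symmetrize and list the surviving (nonzero) conditional dependencies
net = (np.abs(edges) + np.abs(edges.T)) / 2
print(np.argwhere(net > 0))
```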

Relevance: 100.00%

Abstract:

Recommender systems play a central role in providing individualized access to information and services. This paper focuses on collaborative filtering, an approach that exploits the shared structure among like-minded users and similar items. In particular, we focus on a formal probabilistic framework known as Markov random fields (MRFs). We address the open problem of structure learning and introduce a sparsity-inducing algorithm to automatically estimate the interaction structures between users and between items. Item-item and user-user correlation networks are obtained as a by-product. Large-scale experiments on movie recommendation and date matching datasets demonstrate the power of the proposed method.
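As a simplified stand-in for the structure-learning step (not the paper's algorithm), the sketch below fits a sparsity-inducing Gaussian graphical model over the item columns of a synthetic dense rating matrix and reads the nonzero off-diagonal precision entries as an item-item interaction network.

```python
# Item-item network via graphical lasso on simulated ratings; illustrative only.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(6)
n_users, n_items = 500, 12
latent = rng.normal(size=(n_users, 3))                 # shared user taste factors
loadings = rng.normal(size=(3, n_items))
ratings = latent @ loadings + 0.5 * rng.normal(size=(n_users, n_items))
ratings = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0)

model = GraphicalLasso(alpha=0.2).fit(ratings)
precision = model.precision_

# nonzero off-diagonal precision entries define the sparse item-item network
item_edges = np.argwhere((np.abs(precision) > 1e-6) & ~np.eye(n_items, dtype=bool))
print("recovered item-item edges:", len(item_edges) // 2)
```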

Relevance: 100.00%

Abstract:

We present methods for fixed-lag smoothing using sequential importance sampling (SIS) on a discrete non-linear, non-Gaussian state space system with unknown parameters. Our particular application is in the field of digital communication systems. Each input data point is taken from a finite set of symbols. We represent the transmission medium as a fixed filter with a finite impulse response (FIR), hence a discrete state-space system is formed. Conventional Markov chain Monte Carlo (MCMC) techniques such as the Gibbs sampler are unsuitable for this task because they can only perform processing on a batch of data; since data arrive sequentially, it is sensible to process them in this way. In addition, many communication systems are interactive, so there is a maximum level of latency that can be tolerated before a symbol is decoded. We demonstrate this method by simulation and compare its performance to existing techniques.
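A minimal sketch in the spirit of this setup (not the authors' exact scheme): SIS with resampling for BPSK symbols passed through a known three-tap FIR channel with Gaussian noise, where each symbol is decided at a fixed lag by a weighted vote over the particles' histories. The channel taps, noise level and lag are illustrative choices.

```python
# Fixed-lag particle smoothing of BPSK symbols through a known FIR channel.
import numpy as np

rng = np.random.default_rng(7)
h = np.array([1.0, 0.6, 0.3])            # known FIR channel (illustrative taps)
sigma, T, N, lag = 0.4, 200, 300, 3      # noise std, length, particles, smoothing lag

true_s = rng.choice([-1, 1], size=T)
pad = np.concatenate([np.zeros(len(h) - 1), true_s])
y = np.array([h @ pad[t:t + len(h)][::-1] for t in range(T)]) \
    + sigma * rng.normal(size=T)

particles = np.zeros((N, T), dtype=int)  # sampled symbol histories
logw = np.zeros(N)
decided = np.zeros(T, dtype=int)

for t in range(T):
    # predicted contribution of previously sampled symbols, per particle
    tail = np.zeros(N)
    for k in range(1, len(h)):
        if t - k >= 0:
            tail += h[k] * particles[:, t - k]
    like = np.empty((N, 2))
    for a_idx, a in enumerate((-1, 1)):
        like[:, a_idx] = np.exp(-0.5 * ((y[t] - (h[0] * a + tail)) / sigma) ** 2)
    # locally optimal proposal: sample the new symbol proportional to its likelihood
    norm = like.sum(axis=1) + 1e-300
    choice = (rng.random(N) < like[:, 1] / norm).astype(int)
    particles[:, t] = 2 * choice - 1
    logw += np.log(norm)
    # normalize and resample when the effective sample size drops
    w = np.exp(logw - logw.max()); w /= w.sum()
    if 1.0 / np.sum(w ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=w)
        particles = particles[idx]
        logw = np.zeros(N)
        w = np.full(N, 1.0 / N)
    # fixed-lag decision for the symbol transmitted at time t - lag
    if t >= lag:
        decided[t - lag] = 1 if w @ particles[:, t - lag] > 0 else -1

errors = np.sum(decided[:T - lag] != true_s[:T - lag])
print(f"symbol errors: {errors} / {T - lag}")
```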

Relevance: 100.00%

Abstract:

Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade, many crash models have accounted for extra-variation in crash counts, that is, variation over and above that accounted for by the Poisson density. This extra-variation, or dispersion, is theorized to capture unaccounted-for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models, which is tantamount to assuming that unaccounted-for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31-40] challenged the fixed dispersion parameter assumption and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord by exploring additional dispersion functions and using an independent data set, and presents an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four employed traffic flows as explanatory factors in the mean structure, while the remainder also included geometric factors in addition to major and minor road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics. The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e. the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that known influences on expected crash counts are likely to be different from the factors that might help to explain unaccounted-for variation in crashes across sites.
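A toy version of the modeling idea (not the Georgia models): a negative binomial crash-count regression in which both the mean and the dispersion parameter depend on a covariate, sampled with random-walk Metropolis-Hastings under flat priors on simulated data; the coefficients and covariate are invented.

```python
# NB regression with covariate-dependent dispersion, fitted by random-walk MH.
import numpy as np
from scipy.stats import nbinom

rng = np.random.default_rng(8)
n = 400
z = rng.normal(size=n)                       # standardized fake traffic covariate
X = np.column_stack([np.ones(n), z])
beta_true, gamma_true = np.array([1.0, 0.5]), np.array([0.5, -0.3])
mu_true = np.exp(X @ beta_true)              # expected crash count
r_true = np.exp(X @ gamma_true)              # covariate-dependent dispersion (size)
y = nbinom.rvs(n=r_true, p=r_true / (r_true + mu_true), random_state=0)

def loglik(beta, gamma):
    mu = np.exp(X @ beta)
    r = np.exp(X @ gamma)
    return nbinom.logpmf(y, n=r, p=r / (r + mu)).sum()

beta, gamma = np.zeros(2), np.zeros(2)
cur, draws = loglik(beta, gamma), []
for it in range(20000):
    prop_b = beta + 0.05 * rng.normal(size=2)    # random-walk proposals
    prop_g = gamma + 0.05 * rng.normal(size=2)
    new = loglik(prop_b, prop_g)
    if np.log(rng.random()) < new - cur:         # MH accept/reject (flat priors)
        beta, gamma, cur = prop_b, prop_g, new
    if it >= 10000 and it % 10 == 0:             # thin after burn-in
        draws.append(np.concatenate([beta, gamma]))

print("posterior means:", np.round(np.mean(draws, axis=0), 2))
print("true values:    ", np.concatenate([beta_true, gamma_true]))
```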

Relevance: 100.00%

Abstract:

We investigate the utility to computational Bayesian analyses of a particular family of recursive marginal likelihood estimators characterized by the (equivalent) algorithms known as "biased sampling" or "reverse logistic regression" in the statistics literature and "the density of states" in physics. Through a pair of numerical examples (including mixture modeling of the well-known galaxy dataset) we highlight the remarkable diversity of sampling schemes amenable to such recursive normalization, as well as the notable efficiency of the resulting pseudo-mixture distributions for gauging prior-sensitivity in the Bayesian model selection context. Our key theoretical contributions are to introduce a novel heuristic ("thermodynamic integration via importance sampling") for qualifying the role of the bridging sequence in this procedure, and to reveal various connections between these recursive estimators and the nested sampling technique.
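A small numerical sketch of the recursive "biased sampling" estimator on a toy problem: given samples from two unnormalized densities with known constants, the self-consistent equations recover the ratio of normalizing constants, which can be checked against the truth.

```python
# Self-consistent biased-sampling / reverse-logistic-regression update for
# normalizing constants, demonstrated on two unnormalized Gaussian densities.
import numpy as np

rng = np.random.default_rng(9)

# q1 = exp(-x^2/2) with Z1 = sqrt(2*pi); q2 = exp(-(x-2)^2/0.5) with Z2 = 0.5*sqrt(2*pi)
def q1(x): return np.exp(-0.5 * x ** 2)
def q2(x): return np.exp(-0.5 * ((x - 2.0) / 0.5) ** 2)

n1 = n2 = 5000
x1 = rng.normal(0.0, 1.0, n1)            # draws from the normalized q1
x2 = rng.normal(2.0, 0.5, n2)            # draws from the normalized q2
x = np.concatenate([x1, x2])             # pooled sample

Q = np.vstack([q1(x), q2(x)])            # q_k evaluated at every pooled point
n = np.array([n1, n2])
Z = np.ones(2)                           # initial guesses for the constants
for _ in range(200):                     # self-consistent (biased sampling) update
    denom = (n[:, None] / Z[:, None] * Q).sum(axis=0)
    Z = (Q / denom).sum(axis=1)
    Z /= Z[0]                            # constants are identified up to scale

print("estimated Z2/Z1:", Z[1], " true ratio:", 0.5)
```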