931 results for Discrete Markov Random Field Modeling


Relevance: 100.00%

Publisher:

Abstract:

Mixture models implemented via the expectation-maximization (EM) algorithm are increasingly used in a wide range of pattern recognition problems such as image segmentation. However, the EM algorithm requires considerable computational time when applied to huge data sets such as a three-dimensional magnetic resonance (MR) image of over 10 million voxels. Recently, it was shown that a sparse, incremental version of the EM algorithm could improve its rate of convergence. In this paper, we show how this modified EM algorithm can be sped up further by adopting a multiresolution kd-tree structure in performing the E-step. The proposed algorithm outperforms some other variants of the EM algorithm for segmenting MR images of the human brain. (C) 2004 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
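
The E-step described above is the bottleneck that the kd-tree structure accelerates: every voxel must be assigned a responsibility under every mixture component. As a point of reference, here is a minimal, non-accelerated EM for a one-dimensional Gaussian mixture (an illustrative sketch, not the paper's implementation; all names and constants are ours):

```python
import numpy as np

def em_gmm(x, k=2, iters=50, seed=0):
    """Plain (non-accelerated) EM for a 1-D Gaussian mixture: the per-point
    E-step below is the cost that sparse/incremental and kd-tree variants cut."""
    rng = np.random.default_rng(seed)
    n = len(x)
    w = np.full(k, 1.0 / k)                    # mixing weights
    mu = rng.choice(x, k, replace=False)       # initial means
    var = np.full(k, x.var())                  # initial variances
    for _ in range(iters):
        # E-step: responsibility of each component for each data point
        log_p = (-0.5 * (x[:, None] - mu) ** 2 / var
                 - 0.5 * np.log(2 * np.pi * var) + np.log(w))
        r = np.exp(log_p - log_p.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form parameter updates from the responsibilities
        nk = r.sum(axis=0)
        w, mu = nk / n, (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(5, 1, 500)])
print(em_gmm(x))
```

The kd-tree variant replaces this per-point E-step with per-node computations over groups of points whose responsibilities are nearly identical, which is where the reported speed-up comes from.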

Relevance: 100.00%

Publisher:

Abstract:

It is well known that even slight changes in nonuniform illumination lead to large image variability and are critical for many visual tasks. This paper presents a new ICA-related probabilistic model, in which the number of sources exceeds the number of sensors, that performs image segmentation and illumination removal simultaneously. We model illumination and reflectance in log space by a generalized autoregressive process and a hidden Gaussian Markov random field, respectively. The model's ability to segment illuminated images is compared with a Canny edge detector and with homomorphic filtering. We apply the model to two problems: synthetic image segmentation and sea surface pollution detection from intensity images.
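
Homomorphic filtering, one of the baselines mentioned above, rests on the same log-space additivity of illumination and reflectance (log I = log L + log R): illumination is assumed low-frequency, so a high-emphasis frequency filter suppresses it. A minimal sketch of that baseline, with illustrative filter parameters of our choosing (this is not the paper's ICA model):

```python
import numpy as np

def homomorphic_filter(img, cutoff=0.05, low_gain=0.4, high_gain=1.2):
    """Classical homomorphic filtering: attenuate low frequencies of the
    log-image (illumination) while boosting high frequencies (reflectance)."""
    log_img = np.log1p(img.astype(float))
    f = np.fft.fftshift(np.fft.fft2(log_img))
    h, w = img.shape
    y, x = np.ogrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    d2 = (y / h) ** 2 + (x / w) ** 2           # normalized squared frequency
    # Gaussian high-emphasis transfer function
    transfer = (high_gain - low_gain) * (1 - np.exp(-d2 / (2 * cutoff ** 2))) + low_gain
    filtered = np.fft.ifft2(np.fft.ifftshift(f * transfer)).real
    return np.expm1(filtered)

img = np.random.rand(64, 64) * np.linspace(0.2, 1.0, 64)   # toy unevenly lit image
print(homomorphic_filter(img).shape)
```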

Relevance: 100.00%

Publisher:

Abstract:

We develop, implement, and study a new Bayesian spatial mixture model (BSMM). The proposed BSMM allows for spatial structure in the binary activation indicators through a latent thresholded Gaussian Markov random field. We develop a Gibbs (MCMC) sampler to perform posterior inference on the model parameters, which then allows us to assess the posterior probability of activation for each voxel. One purpose of this article is to compare the HJ model and the BSMM in terms of receiver operating characteristic (ROC) curves. We also consider the accuracy of the spatial mixture model and the BSMM for estimating the size of the activation region in terms of bias, variance, and mean squared error. We perform a simulation study to examine these characteristics under a variety of configurations of the spatial mixture model and the BSMM, both as the size of the region changes and as the magnitude of activation changes.
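
The latent construction above can be illustrated directly: draw one sample of a Gaussian Markov random field on a grid and threshold it, yielding spatially clustered binary activation indicators. A minimal sketch with invented grid size and precision parameters (not the article's actual prior):

```python
import numpy as np

def sample_thresholded_gmrf(n=16, kappa=4.0, tau=0.5, thresh=0.5, seed=1):
    """Sample a GMRF on an n x n grid with precision Q = tau*I + kappa*L
    (L = graph Laplacian), then threshold to get binary activation indicators."""
    idx = np.arange(n * n).reshape(n, n)
    Q = tau * np.eye(n * n)
    for a, b in [(idx[:-1, :], idx[1:, :]), (idx[:, :-1], idx[:, 1:])]:
        for i, j in zip(a.ravel(), b.ravel()):     # each grid edge
            Q[i, i] += kappa
            Q[j, j] += kappa
            Q[i, j] -= kappa
            Q[j, i] -= kappa
    # If Q = L L^T, solving L^T z = e with e ~ N(0, I) gives z ~ N(0, Q^{-1}).
    L = np.linalg.cholesky(Q)
    rng = np.random.default_rng(seed)
    z = np.linalg.solve(L.T, rng.standard_normal(n * n)).reshape(n, n)
    return (z > thresh).astype(int)                # binary activation map

print(sample_thresholded_gmrf().sum(), "of 256 sites flagged active")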

Relevance: 100.00%

Publisher:

Abstract:

In the past decade, systems that extract information from millions of Internet documents have become commonplace. Knowledge graphs -- structured knowledge bases that describe entities, their attributes, and the relationships between them -- are a powerful tool for understanding and organizing this vast amount of information. However, a significant obstacle to knowledge graph construction is the unreliability of the extracted information, due to noise and ambiguity in the underlying data and errors made by the extraction system, together with the complexity of reasoning about the dependencies between these noisy extractions. My dissertation addresses these challenges by exploiting the interdependencies between facts to improve the quality of the knowledge graph in a scalable framework. I introduce a new approach called knowledge graph identification (KGI), which resolves the entities, attributes, and relationships in the knowledge graph by incorporating uncertain extractions from multiple sources, entity co-references, and ontological constraints. I define a probability distribution over possible knowledge graphs and infer the most probable knowledge graph using a combination of probabilistic and logical reasoning. Such probabilistic models are frequently dismissed due to scalability concerns, but my implementation of KGI maintains tractable performance on large problems through the use of hinge-loss Markov random fields, which have a convex inference objective. This allows the inference of large knowledge graphs with 4M facts and 20M ground constraints in 2 hours. To scale the solution further, I develop a distributed approach to the KGI problem which runs in parallel across multiple machines, reducing inference time by 90%. Finally, I extend my model to the streaming setting, where a knowledge graph is continuously updated by incorporating newly extracted facts. I devise a general approach for approximately updating inference in convex probabilistic models, and quantify the approximation error by defining and bounding inference regret for online models. Together, my work retains the attractive features of probabilistic models while providing the scalability necessary for large-scale knowledge graph construction. These models have been applied in a number of real-world knowledge graph projects, including the NELL project at Carnegie Mellon and the Google Knowledge Graph.
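
A hinge-loss Markov random field scores a configuration of [0,1]-valued truth assignments with convex potentials of the form max(0, l(x))^p, where l is linear, which is what makes MAP inference a convex program. A toy sketch in that spirit, with an invented three-fact rule set (these rules and names are ours for illustration, not the dissertation's model or the PSL API):

```python
from scipy.optimize import minimize

# Truth values in [0,1] for three hypothetical facts; the first is tied to
# a noisy extractor confidence, the other two are inferred jointly.
EXTRACTOR_CONF = 0.9

def objective(x):
    spouse_ab, spouse_ba, single_a = x
    # Hinge-loss potentials max(0, l(x))^2 with linear l: each term softly
    # penalizes violating one rule, and each is convex in the truth values.
    rules = [
        EXTRACTOR_CONF - spouse_ab,        # respect the noisy extraction
        spouse_ab - spouse_ba,             # spouse is symmetric
        spouse_ab + single_a - 1.0,        # spouse and single are exclusive
    ]
    return sum(max(0.0, r) ** 2 for r in rules)

res = minimize(objective, x0=[0.5, 0.5, 0.5], bounds=[(0, 1)] * 3)
print(dict(zip(["spouse(A,B)", "spouse(B,A)", "single(A)"], res.x.round(3))))
```

Because every potential is convex, the minimizer is global; scaling this objective to millions of ground rules is what the dissertation's distributed and streaming inference addresses.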

Relevance: 100.00%

Publisher:

Abstract:

The LiHo_xY_{1-x}F_4 magnetic material in a transverse magnetic field B_x x̂ perpendicular to the Ising spin direction has long been used to study tunable quantum phase transitions in a random disordered system. We show that the B_x-induced magnetization along the x̂ direction, combined with the local random dilution-induced destruction of crystalline symmetries, generates, via the predominant dipolar interactions between Ho^{3+} ions, random fields along the Ising ẑ direction. This identifies LiHo_xY_{1-x}F_4 in B_x as a new random-field Ising system. The random fields explain the rapid decrease of the critical temperature in the diluted ferromagnetic regime and the smearing of the nonlinear susceptibility at the spin-glass transition with increasing B_x, and they render the B_x-induced quantum criticality in LiHo_xY_{1-x}F_4 likely inaccessible.

Relevance: 100.00%

Publisher:

Abstract:

An emergency is a deviation from a planned course of events that endangers people, property, or the environment. It can be described as an unexpected event that causes economic damage, destruction, and human suffering. When a disaster happens, emergency managers are expected to have a response plan for the most likely disaster scenarios. Unlike earthquakes and terrorist attacks, a hurricane response plan can be activated ahead of time, since a hurricane is predicted at least five days before it makes landfall. This research looked into the logistics aspects of the problem, in an attempt to develop a hurricane relief distribution network model. We addressed the problem of how to efficiently and effectively deliver basic relief goods to victims of a hurricane disaster: specifically, where to preposition State Staging Areas (SSAs), which Points of Distribution (PODs) to activate, and how to allocate commodities to each POD. Previous research has addressed several of these issues, but not with the incorporation of the random behavior of the hurricane's intensity and path. This research presents a stochastic meta-model that deals with the location of SSAs and the allocation of commodities. The novelty of the model is that it treats the strength and path of the hurricane as stochastic processes and models them as discrete Markov chains. The demand is also treated as a stochastic parameter because it depends on the stochastic behavior of the hurricane. However, for the meta-model, the demand is an input that is determined using Hazards United States (HAZUS), a software package developed by the Federal Emergency Management Agency (FEMA) that estimates losses due to hurricanes and floods. A solution heuristic has been developed based on simulated annealing. Since the meta-model is a multi-objective problem, the heuristic is a multi-objective simulated annealing (MOSA) algorithm, in which the initial solution and the cooling rate were determined via a design of experiments. The experiments showed that the initial temperature (T0) is irrelevant, but that the temperature reduction (δ) must be very gradual. Assessment of the meta-model indicates that the Markov chains performed as well as or better than forecasts made by the National Hurricane Center (NHC). Tests of the MOSA showed that it provides solutions in an efficient manner, and an illustrative example shows that the meta-model is practical.
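
The treatment of hurricane strength as a discrete Markov chain can be pictured with a toy intensity chain; the states and transition probabilities below are invented for the example, whereas the research calibrates them against forecast data:

```python
import numpy as np

# Illustrative 3-state intensity chain; each row sums to 1.
states = ["Cat1", "Cat3", "Cat5"]
P = np.array([[0.7, 0.25, 0.05],
              [0.2, 0.6,  0.2 ],
              [0.1, 0.3,  0.6 ]])

def simulate_path(P, start=0, steps=5, seed=0):
    """Sample one intensity trajectory from a discrete Markov chain."""
    rng = np.random.default_rng(seed)
    path = [start]
    for _ in range(steps):
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return [states[s] for s in path]

print(simulate_path(P))                      # one scenario for the meta-model
print(np.linalg.matrix_power(P, 5)[0])       # 5-step-ahead intensity distribution
```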

Relevance: 100.00%

Publisher:

Abstract:

Colloid self-assembly under external control is a new route to the fabrication of advanced materials with novel microstructures and appealing functionalities. The kinetic processes of colloidal self-assembly have also attracted great interest because they resemble many atomic-level kinetic processes in materials. In the past decades, rapid technological progress has been achieved in producing shape-anisotropic, patchy, and core-shell structured particles, as well as particles carrying electric/magnetic charges/dipoles, which has greatly enriched the range of self-assembled structures. Multi-phase carrier liquids offer a further route to controlling colloidal self-assembly. Heterogeneity is therefore an essential characteristic of colloidal systems, yet a model able to efficiently incorporate these possible heterogeneities has so far been lacking. This thesis is mainly devoted to the development of such a model and to a computational study of complex colloidal systems through a diffuse-interface field approach (DIFA), recently developed by Wang et al. This meso-scale model is able to describe arbitrary particle shapes and arbitrary charge/dipole distributions on the surface or in the body of particles. Within the framework of DIFA, a Gibbs-Duhem-type formula is introduced to treat the Laplace pressure in multi-liquid-phase colloidal systems, and it obeys the Young-Laplace equation. The model is thus capable of quantitatively studying important capillarity-related phenomena. Extensive computer simulations are performed to study the fundamental behavior of heterogeneous colloidal systems. The role of the Laplace pressure in determining the mechanical equilibrium of shape-anisotropic particles at fluid interfaces is revealed. In particular, it is found that the Laplace pressure plays a critical role in maintaining the stability of capillary bridges between close particles, which sheds light on a novel route to firming compact but fragile colloidal microstructures in situ via capillary bridges. Simulation results also show that competition between like-charge repulsion, dipole-dipole interaction, and Brownian motion dictates the degree of aggregation of heterogeneously charged particles. The assembly and alignment of particles with magnetic dipoles under an external field is studied. Finally, extended studies of the role of dipole-dipole interaction are performed for ferromagnetic and ferroelectric domain phenomena. The results reveal that the internal field generated by dipoles competes with the external field to determine dipole-domain evolution in ferroic materials.
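
The Young-Laplace equation referenced above relates the pressure jump across a curved interface to the surface tension and the two principal radii of curvature. A trivial sketch (the values are illustrative, not from the thesis):

```python
def laplace_pressure(gamma, r1, r2):
    """Young-Laplace pressure jump across a curved fluid interface,
    dP = gamma * (1/R1 + 1/R2); signed radii let a concave meniscus
    (negative radius) produce the suction that firms a capillary bridge."""
    return gamma * (1.0 / r1 + 1.0 / r2)

# Water-air interface (gamma ~ 0.072 N/m), micron-scale capillary bridge:
# a -2 um meniscus radius and a 10 um neck radius give a negative pressure,
# i.e. a suction holding the two particles together.
print(laplace_pressure(0.072, -2e-6, 10e-6), "Pa")
```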

Relevance: 100.00%

Publisher:

Abstract:

The study of random probability measures is a lively research topic that has attracted interest from different fields in recent years. In this thesis, we consider random probability measures in the context of Bayesian nonparametrics, where the law of a random probability measure is used as a prior distribution, and in the context of distributional data analysis, where the goal is to perform inference given a sample from the law of a random probability measure. The contributions contained in this thesis can be subdivided according to three different topics: (i) the use of almost surely discrete repulsive random measures (i.e., measures whose support points are well separated) for Bayesian model-based clustering; (ii) the proposal of new laws for collections of random probability measures for Bayesian density estimation of partially exchangeable data subdivided into different groups; and (iii) the study of principal component analysis and regression models for probability distributions seen as elements of the 2-Wasserstein space. Specifically, for point (i) we propose an efficient Markov chain Monte Carlo algorithm for posterior inference, which sidesteps the need for split-merge reversible jump moves that are typically associated with poor performance; we propose a model for clustering high-dimensional data by introducing a novel class of anisotropic determinantal point processes; and we study the distributional properties of the repulsive measures, shedding light on important theoretical results that enable more principled prior elicitation and more efficient posterior simulation algorithms. For point (ii), we consider several models suitable for clustering homogeneous populations, inducing spatial dependence across groups of data, and extracting the characteristic traits common to all the data groups, and we propose a novel vector autoregressive model to study the growth curves of Singaporean children. Finally, for point (iii), we propose a novel class of projected statistical methods for distributional data analysis for measures on the real line and on the unit circle.

Relevance: 100.00%

Publisher:

Abstract:

We consider a Random Walk in Random Environment (RWRE) moving in an i.i.d. random field of obstacles. When the particle hits an obstacle, it disappears with a positive probability. We obtain quenched and annealed bounds on the tails of the survival time in the general d-dimensional case. We then consider a simplified one-dimensional model (where transition probabilities and obstacles are independent and the RWRE only moves to neighbour sites) and obtain finer results for the tail of the survival time. In addition, we study the "mixed" probability measures (quenched with respect to the obstacles and annealed with respect to the transition probabilities, and vice versa) and give results for the tails of the survival time with respect to these measures. Further, we apply the same methods to obtain bounds on the tails of hitting times of Branching Random Walks in Random Environment (BRWRE).
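
The simplified one-dimensional model lends itself to a direct Monte Carlo look at the survival-time tail: draw a random environment and an obstacle field, run walks, and record how long they survive. A rough sketch with invented parameter values (the paper's results are analytic bounds, not simulations):

```python
import numpy as np

def survival_tail(n_env=100, n_walks=50, t_max=500, kill=0.5, seed=0):
    """Monte Carlo estimate of P(T > t) for a 1-d nearest-neighbour RWRE:
    i.i.d. site-dependent right-jump probabilities, an i.i.d. obstacle
    field, and death with probability `kill` on each visit to an obstacle."""
    rng = np.random.default_rng(seed)
    size = 2 * t_max + 1                       # all sites reachable in t_max steps
    times = []
    for _ in range(n_env):                     # fresh environment (quenched layer)
        p_right = rng.uniform(0.3, 0.7, size)
        obstacle = rng.random(size) < 0.2
        for _ in range(n_walks):               # walks in this environment
            pos, t = t_max, 0                  # the origin sits at index t_max
            while t < t_max:
                t += 1
                pos += 1 if rng.random() < p_right[pos] else -1
                if obstacle[pos] and rng.random() < kill:
                    break
            times.append(t)
    times = np.array(times)
    for t in (10, 50, 200):
        print(f"P(T > {t}) ~= {(times > t).mean():.4f}")

survival_tail()
```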

Relevance: 100.00%

Publisher:

Abstract:

We consider the problem of estimating the interaction neighborhood from the partial observation of a finite number of realizations of a random field. We introduce a model selection rule to choose estimators of conditional probabilities among natural candidates. Our main result is an oracle inequality satisfied by the resulting estimator. We then use this selection rule in a two-step procedure to evaluate the interacting neighborhoods: the selection rule selects a small prior set of possible interacting points, and a cutting step removes the irrelevant points from this prior set. We also prove that Ising models satisfy the assumptions of the main theorems, without restrictions on the temperature, the structure of the interaction graph, or the range of the interactions; this therefore provides a large class of applications for our results. We give a computationally efficient procedure for these models and finally demonstrate the practical efficiency of our approach in a simulation study.
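
For an Ising model the object being estimated is the conditional probability P(σ_i = +1 | rest), which by the Markov property depends on the rest only through the interacting neighbours: with uniform coupling β it equals 1/(1 + exp(-2β Σ_{j~i} σ_j)). A small sketch that Gibbs-samples a one-dimensional chain and compares the empirical conditional at one site with this formula (our own illustration, not the paper's selection procedure):

```python
import numpy as np

def ising_conditional(beta, neighbor_spins):
    """P(sigma_i = +1 | rest) for an Ising model with uniform coupling beta;
    by the Markov property this depends only on the interacting neighbours."""
    return 1.0 / (1.0 + np.exp(-2.0 * beta * np.sum(neighbor_spins)))

# Gibbs-sample a small periodic 1-d Ising chain and compare the empirical
# conditional at site 10 with the formula above.
rng = np.random.default_rng(0)
beta, n, sweeps = 0.4, 30, 20000
s = rng.choice([-1, 1], n)
records = []
for _ in range(sweeps):
    i = rng.integers(n)
    nb = s[[(i - 1) % n, (i + 1) % n]]
    s[i] = 1 if rng.random() < ising_conditional(beta, nb) else -1
    records.append((s[9] + s[11], s[10]))      # neighbour sum and spin of site 10
arr = np.array(records[2000:])                 # drop burn-in
both_up = arr[arr[:, 0] == 2]                  # condition on both neighbours up
print("empirical:", round((both_up[:, 1] == 1).mean(), 3),
      " theory:", round(float(ising_conditional(beta, [1, 1])), 3))
```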

Relevance: 100.00%

Publisher:

Abstract:

In this paper, we deal with a generalized multi-period mean-variance portfolio selection problem with market parameters subject to Markov random regime switching. Problems of this kind have recently been considered in the literature for control over bankruptcy, for cases in which there are no jumps in the market parameters (see [Zhu, S. S., Li, D., & Wang, S. Y. (2004). Risk control over bankruptcy in dynamic portfolio selection: A generalized mean variance formulation. IEEE Transactions on Automatic Control, 49, 447-457]). We present necessary and sufficient conditions for obtaining an optimal control policy for this Markovian generalized multi-period mean-variance problem, based on a set of interconnected Riccati difference equations and on a set of other recursive equations. Some closed-form expressions are also derived for two special cases, extending some previous results in the literature. We apply the results to a numerical example with real data for risk control over bankruptcy in a dynamic portfolio selection problem with Markov jumps. (C) 2008 Elsevier Ltd. All rights reserved.
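
The regime-switching market model can be pictured with a two-state Markov chain modulating the return moments. The sketch below only simulates terminal wealth under a fixed allocation; the paper's optimal policy would instead come from its interconnected Riccati difference equations. All numbers here are invented:

```python
import numpy as np

# Two market regimes ("bull", "bear") switching as a Markov chain; the
# per-regime return moments are illustrative, not from the paper.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
mu = np.array([0.01, -0.005])      # expected single-period risky return
sigma = np.array([0.02, 0.05])     # per-regime volatility
rf = 0.002                         # risk-free rate

def simulate_wealth(alloc, periods=12, n_paths=20000, seed=0):
    """Terminal wealth of a constant risky allocation under regime switching."""
    rng = np.random.default_rng(seed)
    wealth = np.ones(n_paths)
    regime = np.zeros(n_paths, dtype=int)
    for _ in range(periods):
        r = mu[regime] + sigma[regime] * rng.standard_normal(n_paths)
        wealth *= 1 + rf + alloc * (r - rf)
        # move each path to regime 1 with probability P[current regime, 1]
        regime = (rng.random(n_paths) < P[regime, 1]).astype(int)
    return wealth

w = simulate_wealth(alloc=0.6)
print(f"mean {w.mean():.4f}  variance {w.var():.6f}")
```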

Relevance: 100.00%

Publisher:

Abstract:

A new conceptual model for soil pore-solid structure is formalized. Soil pore-solid structure is proposed to comprise spatially abutting elements, each with a value that is its membership in the fuzzy set "pore," termed porosity. These values range between zero (all solid) and unity (all pore). Images are used to represent structures in which the elements are pixels, each valued by a porosity. Two-dimensional random fields are generated by allocating each pixel a porosity sampled independently from a statistical distribution. These random fields are reorganized into other pore-solid structural types by selecting parent points, each with a specified local region of influence. Pixels of larger or smaller porosity are aggregated about the parent points, within the region of influence, by controlled swapping of pixels in the image. This creates local regions of homogeneity within the random field, in a process similar to simulated annealing. The resulting structures are characterized using one- and two-dimensional variograms and functions describing their connectivity. A variety of examples of structures created by the model is presented and compared. Extension to three dimensions presents no theoretical difficulties and is currently under development.
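
The controlled-swapping step is easy to sketch: porosity values are conserved while their spatial arrangement is biased toward a parent point, which is what produces the local homogeneity. A minimal illustration with invented sizes and swap counts (not the authors' code):

```python
import numpy as np

def aggregate_porosity(n=64, radius=12, n_swaps=20000, seed=0):
    """Start from an i.i.d. random porosity field and swap pixel pairs so
    that high-porosity pixels collect near one parent point; the porosity
    values are conserved, only their arrangement changes."""
    rng = np.random.default_rng(seed)
    field = rng.random((n, n))                 # porosity in [0, 1] per pixel
    cy, cx = rng.integers(n, size=2)           # the parent point
    yy, xx = np.indices((n, n))
    inside = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
    ins = np.argwhere(inside)                  # region of influence
    out = np.argwhere(~inside)
    for _ in range(n_swaps):
        iy, ix = ins[rng.integers(len(ins))]
        oy, ox = out[rng.integers(len(out))]
        # accept the swap only if it raises porosity inside the region
        if field[oy, ox] > field[iy, ix]:
            field[iy, ix], field[oy, ox] = field[oy, ox], field[iy, ix]
    return field

f = aggregate_porosity()
print("mean porosity is conserved:", round(f.mean(), 3))
```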

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVE: To investigate the evolution of delirium in nursing home (NH) residents and its possible predictors. DESIGN: Post-hoc analysis of a prospective cohort assessment. SETTING: Ninety NHs in Switzerland. PARTICIPANTS: 14,771 NH residents. MEASUREMENTS: The Resident Assessment Instrument Minimum Data Set and the Nursing Home Confusion Assessment Method were used to follow subsyndromal or full delirium in NH residents, with discrete Markov chain modeling used to describe long-term trajectories and multiple logistic regression analyses used to determine predictors of the trajectories. RESULTS: We identified four major types of delirium time course in NHs. Increasing severity of cognitive impairment and of depressive symptoms at the initial assessment predicted the different delirium time courses. CONCLUSION: More pronounced cognitive impairment and depressive symptoms at the initial assessment are associated with different subsequent evolutions of delirium. The presence and evolution of delirium in the first year after NH admission predicted the subsequent course of delirium until death.

Relevance: 100.00%

Publisher:

Abstract:

We have developed a technique called RISE (Random Image Structure Evolution), by which one may systematically sample continuous paths in a high-dimensional image space. A basic RISE sequence depicts the evolution of an object's image from a random field, along with the reverse sequence, which depicts the transformation of this image back into randomness. The processing steps are designed to ensure that important low-level image attributes, such as the frequency spectrum and luminance, are held constant throughout a RISE sequence. Experiments based on the RISE paradigm can be used to address some key open issues in object perception, including the neural substrates underlying object perception, the role of prior knowledge and expectation in object perception, and the developmental changes in object perception skills from infancy to adulthood.
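
One simple way to approximate such a path while holding the amplitude (frequency) spectrum nearly constant is to interpolate only the Fourier phase between a random draw and the image's own phase. This is a sketch of the idea rather than the published RISE procedure (which also equates luminance across the sequence):

```python
import numpy as np

def rise_sequence(img, n_frames=8, seed=0):
    """Morph from a random field to the image by interpolating only the
    Fourier phase; the amplitude spectrum is reused for every frame, so the
    frequency content is preserved up to the small imaginary residue that
    taking .real discards on intermediate frames."""
    rng = np.random.default_rng(seed)
    F = np.fft.fft2(img)
    amp, phase = np.abs(F), np.angle(F)
    rand_phase = rng.uniform(-np.pi, np.pi, img.shape)
    frames = []
    for t in np.linspace(0.0, 1.0, n_frames):
        p = (1 - t) * rand_phase + t * phase   # linear phase interpolation
        frames.append(np.fft.ifft2(amp * np.exp(1j * p)).real)
    return frames

img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0                        # toy "object" image
print(len(rise_sequence(img)), "frames from randomness to the object")
```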

Relevance: 100.00%

Publisher:

Abstract:

Exercises and solutions in PDF