943 results for VARIABLE NEIGHBORHOOD RANDOM FIELDS


Relevance:

100.00%

Publisher:

Abstract:

The Leipholz column, in which the Young's modulus, the mass per unit length, and the distributed tangential follower load are modeled as stochastic processes, is considered. The non-self-adjoint differential equation and boundary conditions are taken to have random field coefficients. The standard perturbation method is employed, and the non-self-adjoint operators are used within the regularity domain. The full covariance structure of the free-vibration eigenvalues and critical loads is derived in terms of the second-order properties of the input random fields characterizing the system parameter fluctuations. The mean value of the critical load is calculated from the averaged problem, and the corresponding eigenvalue statistics are obtained. Through the frequency equation, a transformation yields the load-parameter statistics. A numerical study incorporating commonly observed correlation models is reported, illustrating the full potential of the derived expressions.
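The covariance derivation above can be illustrated with a minimal first-order perturbation sketch: an eigenvalue depending on random parameters is linearized about the mean, so its variance follows from the parameter covariance. All sensitivities and covariances below are hypothetical placeholders, not values from the study.

```python
import numpy as np

# First-order perturbation sketch: an eigenvalue lambda(theta) of a system
# depending on random parameters theta is linearized about the mean,
#   lambda ~= lambda_0 + g^T (theta - mu),
# so Var(lambda) = g^T C g, where g is the sensitivity vector and C the
# covariance of the parameter fluctuations (hypothetical values below).

def eigenvalue_variance(g, C):
    """Variance of an eigenvalue under first-order perturbation."""
    g = np.asarray(g, dtype=float)
    return float(g @ C @ g)

# Hypothetical sensitivities to (Young's modulus, mass per unit length,
# follower-load intensity) fluctuations
g = np.array([0.8, -0.5, 0.3])
# Hypothetical covariance of the three correlated parameter fluctuations
C = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.16]])

var_lambda = eigenvalue_variance(g, C)
```

The same quadratic form, applied per eigenvalue and per correlation model, is what produces the full covariance structure referred to in the abstract.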

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we consider the application of belief propagation (BP) to achieve near-optimal signal detection in large multiple-input multiple-output (MIMO) systems at low complexity. Large-MIMO architectures based on spatial multiplexing (V-BLAST) as well as non-orthogonal space-time block codes (STBC) from cyclic division algebra (CDA) are considered. We adopt graphical models based on Markov random fields (MRF) and factor graphs (FG). In the MRF-based approach, we use pairwise compatibility functions even though the graphical models of MIMO systems are fully/densely connected. In the FG approach, we employ a Gaussian approximation (GA) of the multi-antenna interference, which significantly reduces complexity while achieving very good performance at large dimensions. We show that (i) both the MRF- and FG-based BP approaches exhibit large-system behavior, where increasingly close-to-optimal performance is achieved with an increasing number of dimensions, and (ii) damping of messages/beliefs significantly improves the bit error performance.
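The damping mentioned in point (ii) has a very simple form: instead of overwriting a BP message with its freshly computed value, take a convex combination with the previous iterate. A minimal sketch, with a hypothetical damping factor and hypothetical binary messages:

```python
import numpy as np

# Message-damping sketch: a convex combination of the old and newly
# computed message often stabilizes BP on densely connected graphs such
# as the MRF/FG models of large MIMO detectors. alpha and the message
# values below are hypothetical.

def damp(old_msg, new_msg, alpha=0.5):
    """Damped update: alpha * new + (1 - alpha) * old, renormalized."""
    m = alpha * np.asarray(new_msg, float) + (1 - alpha) * np.asarray(old_msg, float)
    return m / m.sum()

old = np.array([0.9, 0.1])   # previous message over a binary symbol
new = np.array([0.2, 0.8])   # freshly computed message
damped = damp(old, new, alpha=0.5)
```

With alpha = 1 this reduces to undamped BP; smaller alpha trades convergence speed for stability.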

Relevance:

100.00%

Publisher:

Abstract:

Structural Support Vector Machines (SSVMs) and Conditional Random Fields (CRFs) are popular discriminative methods for classifying structured and complex objects such as parse trees, image segments, and part-of-speech tags. The datasets involved are of very high dimensionality, and the models produced by typical training algorithms for SSVMs and CRFs are non-sparse. This non-sparsity results in slow inference, so there is a need for new algorithms for sparse SSVM and CRF classifier design. The use of the elastic net and the L1-regularizer has already been explored for solving primal CRF and SSVM problems, respectively, to design sparse classifiers. In this work, we focus on the dual elastic-net-regularized SSVM and CRF. By exploiting the weakly coupled structure of these convex programming problems, we propose a new sequential alternating proximal (SAP) algorithm to solve the dual problems. The algorithm works by sequentially visiting each training example and solving a simple subproblem restricted to the small subset of variables associated with that example. Numerical experiments on various benchmark sequence-labeling datasets demonstrate that the proposed algorithm scales well. Further, the classifiers designed are sparser than those obtained by solving the respective primal problems, with comparable generalization performance. The proposed SAP algorithm is thus a useful alternative for sparse SSVM and CRF classifier design.
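The per-example subproblems in schemes of this kind reduce to proximal steps. As an illustration of the elastic-net ingredient (not the SAP algorithm itself), here is the closed-form proximal operator of the elastic-net penalty, with hypothetical parameter values:

```python
import numpy as np

# Elastic-net proximal operator sketch: the penalty
# lam1 * ||w||_1 + (lam2 / 2) * ||w||_2^2 has a closed-form prox that
# soft-thresholds (producing sparsity) and then shrinks. Values below
# are hypothetical.

def prox_elastic_net(v, lam1, lam2, step=1.0):
    """argmin_w (1/(2*step))*||w - v||^2 + lam1*||w||_1 + (lam2/2)*||w||^2."""
    v = np.asarray(v, float)
    soft = np.sign(v) * np.maximum(np.abs(v) - step * lam1, 0.0)
    return soft / (1.0 + step * lam2)

w = prox_elastic_net(np.array([1.5, -0.2, 0.7]), lam1=0.5, lam2=1.0)
# entries smaller than lam1 in magnitude are zeroed out -> sparse models
```

Repeatedly applying such cheap per-block steps, one training example at a time, is what allows the approach to scale to large sequence-labeling datasets.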

Relevance:

100.00%

Publisher:

Abstract:

The spatial variability that exists in pavement systems can be conveniently represented by means of random fields; in this study, a probabilistic analysis that considers this spatial variability, including the anisotropic nature of the pavement layer properties, is presented. The integration of spatially varying log-normal random fields into a linear-elastic finite-difference analysis is achieved through the expansion optimal linear estimation method. For the estimation of the critical pavement responses, metamodels based on polynomial chaos expansion (PCE) are developed to replace the computationally expensive finite-difference model. A sparse polynomial chaos expansion, based on an adaptive regression algorithm and enhanced by global sensitivity analysis (GSA), is used, with significant savings in computational effort. The effect of anisotropy in each layer on the pavement responses is studied separately, and an effort is made to identify the pavement layer in which the introduction of anisotropic characteristics has the most significant impact on the critical strains. It is observed that anisotropy in the base layer has a significant but diverse effect on the two critical strains: while the compressive strain tends to be considerably higher than that observed for the isotropic section, the tensile strain shows a decrease in mean value with the introduction of base-layer anisotropy. Furthermore, asphalt-layer anisotropy also tends to decrease the critical tensile strain while having little effect on the critical compressive strain. (C) 2015 American Society of Civil Engineers.
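The PCE metamodeling idea can be sketched in one dimension: an expensive model response is approximated as a Hermite-polynomial series in a standard-normal germ, with coefficients fitted by least squares. The "expensive model" below is a hypothetical stand-in for the finite-difference pavement analysis.

```python
import numpy as np

# Minimal polynomial-chaos sketch: fit a 1-D Hermite surrogate
# y ~= sum_k c_k He_k(xi) to samples of an expensive model by regression.

def hermite_design(xi, order):
    """Probabilists' Hermite polynomials He_0..He_order evaluated at xi."""
    xi = np.asarray(xi, float)
    H = [np.ones_like(xi), xi]
    for k in range(1, order):
        H.append(xi * H[k] - k * H[k - 1])   # He_{k+1} = xi*He_k - k*He_{k-1}
    return np.column_stack(H[: order + 1])

rng = np.random.default_rng(0)
xi = rng.standard_normal(200)            # standard-normal germ samples
y = 1.0 + 2.0 * xi + 0.5 * (xi**2 - 1)   # hypothetical model response
coef, *_ = np.linalg.lstsq(hermite_design(xi, 2), y, rcond=None)
```

Once fitted, the surrogate is evaluated in place of the expensive model, which is where the reported computational savings come from; sparsity and GSA further prune the basis in higher dimensions.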

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents a technique for obtaining the response of linear structural systems with parameter uncertainties subjected to either deterministic or random excitation. The parameter uncertainties are modeled as random variables or random fields, and are assumed to be time-independent. The new method is an extension of the deterministic finite element method to the space of random functions.

First, the general formulation of the method is developed, in the case where the excitation is deterministic in time. Next, the application of this formulation to systems satisfying the one-dimensional wave equation with uncertainty in their physical properties is described. A particular physical conceptualization of this equation is chosen for study, and some engineering applications are discussed in both an earthquake ground motion and a structural context.

Finally, the formulation of the new method is extended to include cases where the excitation is random in time. Application of this formulation to the random response of a primary-secondary system is described. It is found that parameter uncertainties can have a strong effect on the system response characteristics.
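The kind of effect described above, where parameter uncertainty alters the response statistics, can be illustrated with a deliberately simple stand-in: the static response of an element with random stiffness, estimated by Monte Carlo. This is only an illustration of uncertainty propagation, not the thesis's stochastic finite element formulation.

```python
import numpy as np

# Monte Carlo sketch: static response u = F / k of a hypothetical element
# with log-normally distributed stiffness k. Uncertainty in k shifts the
# mean response away from the deterministic value F / E[k].

rng = np.random.default_rng(7)
F = 1.0
k = rng.lognormal(mean=0.0, sigma=0.2, size=100_000)  # uncertain stiffness
u = F / k
mean_u, std_u = u.mean(), u.std()
deterministic_u = F / k.mean()  # response of the "averaged" system
# mean_u > deterministic_u: the uncertain system responds more, on average
```

Even this toy case shows why ignoring parameter uncertainty biases the predicted response, the effect the thesis quantifies for dynamic primary-secondary systems.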

Relevance:

100.00%

Publisher:

Abstract:

Tracer injection techniques have been widely used in the investigation of flows in porous media, mainly in problems involving the numerical simulation of miscible flows in petroleum reservoirs and the transport of contaminants in aquifers. Subsurface reservoirs are in general heterogeneous and may exhibit significant variations of their properties over several length scales. These spatial variations are incorporated into the equations governing the flow within the porous medium by means of random fields. Such fields can provide a description of the heterogeneities of the subsurface formation in cases where geological knowledge does not supply the detail required for a deterministic prediction of the flow through the porous medium. In this thesis, a log-normal model is employed for the permeability field in order to reproduce the permeability distribution of the real medium, and the numerical generation of these random fields is carried out by the Successive Sum of Independent Gaussian Fields (SSCGI) method. The main objective of this work is the study of uncertainty quantification for the inverse problem of tracer transport in a heterogeneous porous medium, employing a Bayesian approach to update the permeability fields based on measurements of the tracer's spatial concentration at specific times. A two-stage Markov Chain Monte Carlo method is used to sample the posterior probability distribution, and the Markov chain is built from the random reconstruction of the permeability fields. The pressure-velocity problem governing the flow is solved with a Mixed Finite Element method suited to the accurate computation of fluxes in heterogeneous permeability fields, and a Lagrangian approach, the Forward Integral Tracking (FIT) method, is used in the numerical simulation of the tracer transport problem.
Numerical results are obtained and presented for a set of sample realizations of the permeability fields.
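The two-stage MCMC idea can be sketched in one dimension: a cheap coarse-model screening step filters proposals before the expensive fine-model evaluation, with a second acceptance test that preserves detailed balance. Both "models" below are hypothetical 1-D stand-ins for the coarse and fine posteriors over permeability.

```python
import math
import random

def log_post_fine(x):    # expensive posterior (hypothetical stand-in)
    return -0.5 * x * x

def log_post_coarse(x):  # cheap approximate posterior (hypothetical stand-in)
    return -0.5 * (x / 1.1) ** 2

def two_stage_mcmc(n, step=1.0, seed=42):
    rng = random.Random(seed)
    x, chain = 0.0, []
    for _ in range(n):
        y = x + rng.gauss(0.0, step)
        # Stage 1: screen the proposal with the coarse model only
        a1 = math.exp(min(0.0, log_post_coarse(y) - log_post_coarse(x)))
        if rng.random() < a1:
            # Stage 2: correct with the fine model; the coarse ratio is
            # divided out so the chain targets the fine posterior exactly
            a2 = math.exp(min(0.0, (log_post_fine(y) - log_post_fine(x))
                                   - (log_post_coarse(y) - log_post_coarse(x))))
            if rng.random() < a2:
                x = y
        chain.append(x)
    return chain

samples = two_stage_mcmc(5000)
```

The benefit is that proposals rejected at stage 1 never trigger the expensive fine-model solve; in the thesis's setting that solve is the full pressure-velocity and transport simulation.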

Relevance:

100.00%

Publisher:

Abstract:

We present a model for early vision tasks such as denoising, super-resolution, deblurring, and demosaicing. The model provides a resolution-independent representation of discrete images which admits a truly rotationally invariant prior. The model generalizes several existing approaches: variational methods, finite element methods, and discrete random fields. The primary contribution is a novel energy functional which has not previously been written down, which combines the discrete measurements from pixels with a continuous-domain world viewed through continuous-domain point-spread functions. The value of the functional is that simple priors (such as total variation and generalizations) on the continuous-domain world become realistic priors on the sampled images. We show that despite its apparent complexity, optimization of this model depends on just a few computational primitives, which although tedious to derive, can now be reused in many domains. We define a set of optimization algorithms which greatly overcome the apparent complexity of this model, and make possible its practical application. New experimental results include infinite-resolution upsampling, and a method for obtaining subpixel superpixels. © 2012 IEEE.
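The total-variation prior mentioned above can be illustrated in its simplest discrete 1-D form, minimized by gradient descent on a smoothed TV energy. This illustrates only the prior; the paper's functional couples such priors to continuous-domain point-spread-function measurements. The signal and parameters are hypothetical.

```python
import numpy as np

# 1-D total-variation denoising sketch: minimize
#   0.5 * ||x - y||^2 + lam * sum_i sqrt((x_{i+1} - x_i)^2 + eps)
# by gradient descent (eps smooths the absolute value).

def tv_denoise_1d(y, lam=0.5, eps=1e-2, steps=1000, lr=0.05):
    x = y.copy()
    for _ in range(steps):
        d = np.diff(x)
        g = d / np.sqrt(d * d + eps)   # gradient of the smoothed |d| terms
        tv_grad = np.zeros_like(x)
        tv_grad[:-1] -= g
        tv_grad[1:] += g
        x -= lr * ((x - y) + lam * tv_grad)
    return x

# Noisy step signal: TV removes the oscillation but keeps the edge
y = np.array([0., 0., 0., 1., 1., 1.]) + 0.1 * np.array([1, -1, 1, -1, 1, -1.])
x = tv_denoise_1d(y)
```

The characteristic behavior, flattening within regions while preserving the jump, is why TV-type priors are attractive for images.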

Relevance:

100.00%

Publisher:

Abstract:

We introduce a conceptually novel structured prediction model, GPstruct, which is kernelized, non-parametric and Bayesian, by design. We motivate the model with respect to existing approaches, among others, conditional random fields (CRFs), maximum margin Markov networks (M3N), and structured support vector machines (SVMstruct), which embody only a subset of its properties. We present an inference procedure based on Markov Chain Monte Carlo. The framework can be instantiated for a wide range of structured objects such as linear chains, trees, grids, and other general graphs. As a proof of concept, the model is benchmarked on several natural language processing tasks and a video gesture segmentation task involving a linear chain structure. We show prediction accuracies for GPstruct which are comparable to or exceeding those of CRFs and SVMstruct.

Relevance:

100.00%

Publisher:

Abstract:

With the development of the Internet and electronic office work, a huge volume of text resources has emerged. Information extraction techniques help people quickly obtain useful information from large-scale text. Named entity recognition and relation extraction are two fundamental tasks of information extraction. Building on a survey of the main methods currently used for named entity recognition and entity relation extraction, this thesis presents solutions for both tasks. The main contributions are: (1) The state of the art in named entity recognition and entity relation extraction is summarized from the perspectives of model selection and feature selection, with emphasis on statistical learning methods for named entity recognition and kernel-based methods for entity relation extraction. (2) To address the problem of determining entity segment boundaries in named entity recognition, the thesis studies how to apply the Semi-Markov CRFs model to Chinese named entity recognition. This model only requires Markov dependencies between segments, while the text within a segment can be flexibly assigned various rules. When applying the model to Chinese named entity recognition, features favorable to identifying entity segment boundaries can be designed more effectively and freely. Experiments show that adding segment-related features improves named entity recognition performance by 4-5 percentage points. (3) The task of entity relation extraction is to determine the semantic relation between two entities. Previous work has shown that the syntactic-tree structure spanning the two entities is very useful for determining their relation type, while the relatively mature flat-feature methods have limited ability to fully exploit syntactic-tree structural features; therefore, this thesis studies kernel-based methods for Chinese entity relation extraction. Considering the characteristics of Chinese, we investigate how different syntactic trees used in the convolution kernel affect Chinese relation extraction performance, construct several composite kernels based on convolution kernels, and improve the shortest-path dependency kernel. When kernel methods were first applied to English relation extraction, the F1 score was only around 40%; using only a convolution kernel over syntactic trees, our Chinese relation extraction reaches an F1 of 35%, showing that kernel methods are also effective for Chinese relation extraction.
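The segment-level decoding that distinguishes Semi-Markov CRFs from token-level CRFs can be sketched as a dynamic program over segments up to a maximum length, which is exactly what allows whole-segment boundary features. The scoring function below is a hypothetical toy, not a trained model.

```python
# Segment-level Viterbi sketch for a semi-Markov model: whole segments up
# to max_len tokens are scored as units, unlike token-level decoding.

def semi_markov_viterbi(n, labels, max_len, seg_score):
    """Best segmentation of positions 0..n-1 under additive segment scores."""
    best = [float("-inf")] * (n + 1)
    best[0], back = 0.0, [None] * (n + 1)
    for end in range(1, n + 1):
        for start in range(max(0, end - max_len), end):
            for y in labels:
                s = best[start] + seg_score(start, end, y)
                if s > best[end]:
                    best[end], back[end] = s, (start, y)
    segs, i = [], n  # recover the best segmentation by backtracking
    while i > 0:
        start, y = back[i]
        segs.append((start, i, y))
        i = start
    return best[n], segs[::-1]

# Hypothetical scorer: strongly prefers one "NAME" segment over tokens 2..4
def seg_score(start, end, y):
    if (start, end, y) == (2, 5, "NAME"):
        return 5.0
    return 1.0 if y == "O" and end - start == 1 else -1.0

score, segs = semi_markov_viterbi(6, ["O", "NAME"], 4, seg_score)
```

Because `seg_score` sees the entire candidate segment, features such as segment length or whole-segment dictionary matches are straightforward, which is the advantage the thesis exploits for Chinese entity boundaries.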

Relevance:

100.00%

Publisher:

Abstract:

In this thesis we study the general problem of reconstructing a function, defined on a finite lattice, from a set of incomplete, noisy and/or ambiguous observations. The goal of this work is to demonstrate the generality and practical value of a probabilistic (in particular, Bayesian) approach to this problem, particularly in the context of Computer Vision. In this approach, the prior knowledge about the solution is expressed in the form of a Gibbsian probability distribution on the space of all possible functions, so that the reconstruction task is formulated as an estimation problem. Our main contributions are the following: (1) We introduce the use of specific error criteria for the design of the optimal Bayesian estimators for several classes of problems, and propose a general (Monte Carlo) procedure for approximating them. This new approach leads to a substantial improvement over the existing schemes, both regarding the quality of the results (particularly for low signal-to-noise ratios) and the computational efficiency. (2) We apply the Bayesian approach to the solution of several problems, some of which are formulated and solved in these terms for the first time. Specifically, these applications are: the reconstruction of piecewise constant surfaces from sparse and noisy observations; the reconstruction of depth from stereoscopic pairs of images; and the formation of perceptual clusters. (3) For each one of these applications, we develop fast, deterministic algorithms that approximate the optimal estimators, and illustrate their performance on both synthetic and real data. (4) We propose a new method, based on the analysis of the residual process, for estimating the parameters of the probabilistic models directly from the noisy observations. This scheme leads to an algorithm, which has no free parameters, for the restoration of piecewise uniform images.
(5) We analyze the implementation of the algorithms that we develop in non-conventional hardware, such as massively parallel digital machines, and analog and hybrid networks.
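The Gibbsian-prior machinery can be illustrated with the classic toy case: a Gibbs sampler for an Ising smoothness prior coupled to noisy binary observations on a small lattice. The parameters and the tiny image below are hypothetical, chosen only to show the local conditional updates.

```python
import math
import random

# Gibbs-sampler sketch for Bayesian reconstruction with an Ising prior:
# posterior ~ exp(beta * sum_neighbors x_i x_j + gamma * sum_i x_i obs_i).

def gibbs_denoise(obs, beta=1.5, gamma=2.0, sweeps=50, seed=0):
    """obs: 2-D list of +/-1 noisy pixels. Returns one posterior sample."""
    rng = random.Random(seed)
    h, w = len(obs), len(obs[0])
    x = [row[:] for row in obs]
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                nb = sum(x[a][b]
                         for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                         if 0 <= a < h and 0 <= b < w)
                # local conditional: P(x_ij = +1) = sigmoid(2*(beta*nb + gamma*obs_ij))
                e = 2.0 * (beta * nb + gamma * obs[i][j])
                p_plus = 1.0 / (1.0 + math.exp(-e))
                x[i][j] = 1 if rng.random() < p_plus else -1
    return x

noisy = [[1, 1, -1],
         [1, 1, 1],
         [1, -1, 1]]   # mostly +1 with two flipped pixels (hypothetical)
sample = gibbs_denoise(noisy)
```

Averaging many such samples approximates posterior-mean estimators; the thesis's Monte Carlo procedure generalizes this to other error criteria, and its deterministic algorithms approximate the resulting estimators without sampling.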

Relevance:

100.00%

Publisher:

Abstract:

The scheduling problem in distributed data-intensive computing environments has become an active research topic due to the tremendous growth of grid and cloud computing. As an innovative distributed intelligent paradigm, swarm intelligence provides a novel approach to solving these potentially intractable problems. In this paper, we formulate the scheduling problem for work-flow applications with security constraints in distributed data-intensive computing environments and present a novel security constraint model. Several meta-heuristic adaptations of the particle swarm optimization algorithm are introduced to construct efficient schedules. A variable neighborhood particle swarm optimization algorithm is compared with multi-start particle swarm optimization and a multi-start genetic algorithm. Experimental results illustrate that population-based meta-heuristic approaches usually provide a good balance between global exploration and local exploitation, and demonstrate their feasibility and effectiveness for scheduling work-flow applications. © 2010 Elsevier Inc. All rights reserved.
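The variable-neighborhood flavor of PSO can be sketched generically: run standard PSO, and when the swarm stagnates, re-seed the particles in a neighborhood of the best-known position whose radius grows on each restart. Everything below is a hypothetical stand-in (a 2-D sphere objective replaces the makespan/security objective of the paper).

```python
import random

# Variable-neighborhood-style PSO sketch on a toy objective.

def sphere(x):
    return sum(v * v for v in x)

def vn_pso(dim=2, n=20, iters=200, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=sphere)[:]
    stall, radius = 0, 0.5
    for _ in range(iters):
        improved = False
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if sphere(pos[i]) < sphere(pbest[i]):
                pbest[i] = pos[i][:]
            if sphere(pbest[i]) < sphere(gbest):
                gbest, improved = pbest[i][:], True
        stall = 0 if improved else stall + 1
        if stall >= 10:   # stagnation: variable-neighborhood re-seeding
            for i in range(n):
                pos[i] = [g + rng.uniform(-radius, radius) for g in gbest]
            radius *= 2.0  # enlarge the neighborhood on each restart
            stall = 0
    return gbest

best = vn_pso()
```

The growing-radius restart is what gives the method its exploration/exploitation balance: small neighborhoods refine the incumbent, larger ones escape local optima.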

Relevance:

100.00%

Publisher:

Abstract:

Visual salience is an intriguing phenomenon observed in biological neural systems. Numerous attempts have been made to model visual salience mathematically using various feature contrasts, either locally or globally. However, these algorithmic models tend to ignore the problem's biological solutions, in which visual salience appears to arise during the propagation of visual stimuli along the visual cortex. In this paper, inspired by the conjecture that salience arises from deep propagation along the visual cortex, we present a Deep Salience model, in which a multi-layer model based on successive Markov random fields (sMRF) analyzes the input image through deep belief propagation. As a result, the foreground object can be automatically separated from the background in a fully unsupervised way. Experimental evaluation on the benchmark dataset validated that our Deep Salience model consistently outperforms eleven state-of-the-art salience models, yielding higher rates in the precision-recall tests and attaining the best F-measure and lowest mean-square error in the experiments.

Relevance:

100.00%

Publisher:

Abstract:

Super-resolution refers to the process of obtaining a high-resolution image from one or more low-resolution images. In this work, we present a novel method for the super-resolution problem in the limited case where only one low-resolution image is given as input. The proposed method is based on statistical learning for inferring the high-frequency regions that help distinguish a high-resolution image from a low-resolution one. These inferences are obtained from the correlation between low- and high-resolution regions that come exclusively from the image to be super-resolved, in terms of small neighborhoods. Markov random fields are used as a model to capture the local statistics of high- and low-resolution data when they are analyzed at different scales and resolutions. Experimental results show the viability of the method.
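The self-similarity idea, learning low-to-high-resolution correspondences from the input image itself, can be sketched with a nearest-neighbor patch lookup. The MRF coupling between neighboring patches is omitted here, so this is only the building block, on synthetic data:

```python
import numpy as np

# Example-based super-resolution sketch: build a bank of small patches
# from a downsampled version of the input image, then find the closest
# bank patch for a query. A full method would pair each low-res patch
# with its high-res counterpart and let an MRF enforce consistency
# between neighboring choices.

def patches(img, size):
    h, w = img.shape
    return [((i, j), img[i:i + size, j:j + size])
            for i in range(h - size + 1) for j in range(w - size + 1)]

def nearest_patch(query, bank):
    return min(bank, key=lambda p: float(np.sum((p[1] - query) ** 2)))

rng = np.random.default_rng(0)
img = rng.random((8, 8))          # hypothetical input image
low = img[::2, ::2]               # crude downsampling
bank = patches(low, 2)            # patch bank from the image itself
query = low[1:3, 1:3] + 0.01      # slightly perturbed query patch
pos, match = nearest_patch(query, bank)
```

In the full method, the matched positions index high-resolution patches, and the MRF selects among candidate matches so adjacent patches agree on their overlaps.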

Relevance:

100.00%

Publisher:

Abstract:

One of the unsupervised learning models generating the most active research is the Boltzmann machine, in particular the restricted Boltzmann machine, or RBM. An important aspect of both training and exploiting such a model is drawing samples. Two recent developments, fast persistent contrastive divergence (FPCD) and herding, aim to improve this aspect, focusing mainly on the learning process itself. Notably, herding forgoes obtaining a precise estimate of the RBM's parameters, instead defining a distribution through a dynamical system driven by the training examples. We generalize these ideas to obtain algorithms for exploiting the probability distribution defined by a pre-trained RBM, by drawing representative samples from it, without requiring the training set. We present three methods: sample penalization (based on a theoretical intuition), as well as FPCD and herding using constant statistics for the positive phase. These methods define dynamical systems that produce samples with the desired statistics, and we evaluate them using a non-parametric density estimation method. We show that these methods mix substantially better than the conventional method, Gibbs sampling.
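The conventional baseline the thesis improves upon, block Gibbs sampling from an RBM, can be sketched in a few lines: alternate sampling of hidden units given visibles and visibles given hiddens. The weights below are random placeholders for a hypothetical "pre-trained" model; the point is the sampling mechanics whose slow mixing motivates FPCD and herding.

```python
import numpy as np

# Block-Gibbs sampling sketch for a small binary RBM.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_chain(W, b_v, b_h, steps, rng):
    """Alternate h | v and v | h sampling; return the final visible state."""
    v = (rng.random(b_v.size) < 0.5).astype(float)  # random start
    for _ in range(steps):
        h = (rng.random(b_h.size) < sigmoid(v @ W + b_h)).astype(float)
        v = (rng.random(b_v.size) < sigmoid(h @ W.T + b_v)).astype(float)
    return v

rng = np.random.default_rng(3)
W = rng.normal(0, 0.1, size=(6, 4))   # hypothetical weights: 6 visible, 4 hidden
b_v, b_h = np.zeros(6), np.zeros(4)   # hypothetical biases
sample = gibbs_chain(W, b_v, b_h, steps=100, rng=rng)
```

When the model has sharp modes, such a chain can stay stuck near one mode for many steps; the sample-penalization and constant-positive-phase dynamics proposed in the thesis are designed to move between modes faster.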