72 results for Probabilistic Reasoning
Abstract:
A computer can assist the process of design by analogy by recording past designs. The experience these represent could be much wider than that of the designers using the system, who therefore need to identify potential cases of interest. If the computer assists with this lookup, the designers can concentrate on the more interesting aspects of extracting and using the ideas that are found. However, as the knowledge base grows it becomes ever harder to find relevant cases using a keyword indexing scheme without knowing precisely what to look for. A more flexible searching system is therefore needed.
If a similarity measure can be defined for the features of the designs, then it is possible to match and cluster them. Using a simple measure such as co-occurrence of features within a particular case would allow this to happen without human intervention, which is tedious and time-consuming. Any knowledge acquired about how features are related to each other will be very shallow: it is not intended as a cognitive model of how humans understand, learn, or retrieve information, but rather an attempt to make effective, efficient use of the information available. The question remains whether such shallow knowledge is sufficient for the task.
A system to retrieve information from a large database is described. It uses co-occurrences to relate keywords to each other, and then extends search queries with similar words. This seems to make relevant material more accessible, providing hope that this retrieval technique can be applied to a broader knowledge base.
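The co-occurrence-based query expansion described above can be sketched on a toy corpus. The cases, keywords, and scoring below are illustrative assumptions, not the actual system's data:

```python
from collections import defaultdict

# Toy corpus: each design case is a set of index keywords (made up here).
cases = [
    {"bridge", "cable", "steel"},
    {"bridge", "arch", "stone"},
    {"cable", "steel", "tower"},
]

# Count how often each ordered pair of keywords co-occurs within a case.
cooc = defaultdict(int)
for case in cases:
    for a in case:
        for b in case:
            if a != b:
                cooc[(a, b)] += 1

def expand(query, k=2):
    """Extend a query with the k keywords co-occurring most with its terms."""
    scores = defaultdict(int)
    for term in query:
        for (a, b), n in cooc.items():
            if a == term and b not in query:
                scores[b] += n
    related = sorted(scores, key=scores.get, reverse=True)[:k]
    return set(query) | set(related)

def retrieve(query):
    """Indices of cases sharing at least one keyword with the (expanded) query."""
    return [i for i, case in enumerate(cases) if case & query]
```

Expanding a query before retrieval then surfaces cases that share no literal keyword with the original query but are related through co-occurrence.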
Abstract:
We propose a novel model for the spatio-temporal clustering of trajectories based on motion, which applies to challenging street-view video sequences of pedestrians captured by a mobile camera. A key contribution of our work is the introduction of novel probabilistic region trajectories, motivated by the non-repeatability of segmentation of frames in a video sequence. Hierarchical image segments are obtained by using a state-of-the-art hierarchical segmentation algorithm, and connected across adjacent frames in a directed acyclic graph. The region trajectories and measures of confidence are extracted from this graph using a dynamic programming-based optimisation. Our second main contribution is a Bayesian framework with a twofold goal: to learn the Random Forests classifier of motion patterns, optimal in a maximum-likelihood sense, based on video features, and to construct a unique graph from region trajectories of different frames, lengths, and hierarchical levels. Finally, we demonstrate the use of Isomap for effective spatio-temporal clustering of the region trajectories of pedestrians. We support our claims with experimental results on new and existing challenging video sequences. © 2011 IEEE.
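The dynamic-programming extraction of a trajectory from the segment graph can be illustrated on a toy example. The node labels, edges, and confidence scores below are made up for illustration and stand in for the paper's segment DAG:

```python
# Nodes are (frame, segment) pairs; edges link segments in adjacent frames.
edges = {
    ("f0", "a"): [("f1", "a"), ("f1", "b")],
    ("f0", "b"): [("f1", "b")],
    ("f1", "a"): [("f2", "a")],
    ("f1", "b"): [("f2", "a")],
    ("f2", "a"): [],
}
# Per-segment confidence scores (illustrative values).
score = {("f0", "a"): 0.9, ("f0", "b"): 0.4,
         ("f1", "a"): 0.7, ("f1", "b"): 0.8,
         ("f2", "a"): 0.6}

def best_path(node):
    """Best (total confidence, path) over trajectories starting at node."""
    tails = [best_path(nxt) for nxt in edges[node]]
    if not tails:
        return score[node], [node]
    best_score, best_tail = max(tails)
    return score[node] + best_score, [node] + best_tail
```

In practice the subproblems would be memoised (a single pass in topological order suffices on a DAG); the recursion is kept bare here for brevity.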
Abstract:
A number of recent scientific and engineering problems require signals to be decomposed into a product of a slowly varying positive envelope and a quickly varying carrier whose instantaneous frequency also varies slowly over time. Although signal processing provides algorithms for so-called amplitude- and frequency-demodulation (AFD), there are well-known problems with all of the existing methods. Motivated by the fact that AFD is ill-posed, we approach the problem using probabilistic inference. The new approach, called probabilistic amplitude and frequency demodulation (PAFD), models instantaneous frequency using an auto-regressive generalization of the von Mises distribution, and the envelopes using Gaussian auto-regressive dynamics with a positivity constraint. A novel form of expectation propagation is used for inference. We demonstrate that although PAFD is computationally demanding, it outperforms previous approaches on synthetic and real signals in clean, noisy and missing data settings.
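PAFD itself (expectation propagation over a von Mises auto-regressive model) is beyond a short sketch, but the classical AFD baseline the abstract alludes to can be illustrated with an FFT-based analytic signal. The test signal below is made up for the demonstration:

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (a classical AFD baseline, not PAFD)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0          # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0              # keep Nyquist bin as-is
    return np.fft.ifft(X * h)

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
envelope = 1.0 + 0.5 * np.sin(2 * np.pi * 2 * t)   # slow, positive envelope
x = envelope * np.cos(2 * np.pi * 50 * t)          # 50 Hz carrier

z = analytic_signal(x)
est_env = np.abs(z)                                # amplitude estimate
inst_freq = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)  # Hz
```

This works well when the envelope varies much more slowly than the carrier; the ill-posedness the abstract mentions appears once that assumption breaks down or the signal is noisy.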
Abstract:
The diversity of non-domestic buildings at urban scale poses a number of difficulties for developing models for large-scale analysis of the stock. This research proposes a probabilistic, engineering-based, bottom-up model to address these issues. In a recent study we classified London's non-domestic buildings based on the service they provide, such as offices, retail premises, and schools, and proposed the creation of one probabilistic representational model per building type. This paper investigates techniques for the development of such models. The representational model is a statistical surrogate of a dynamic energy simulation (ES) model. We first identify the main parameters affecting energy consumption in a particular building sector/type by using sampling-based global sensitivity analysis methods, and then generate statistical surrogate models of the dynamic ES model within the dominant model parameters. Given a sample of actual energy consumption for that sector, we use the surrogate model to infer the distribution of model parameters by inverse analysis. The inferred distributions of input parameters are able to quantify the relative benefits of alternative energy saving measures on an entire building sector with requisite quantification of uncertainties. Secondary school buildings are used for illustrating the application of this probabilistic method. © 2012 Elsevier B.V. All rights reserved.
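The surrogate-plus-inverse-analysis pipeline can be sketched on a toy problem. The formula standing in for the dynamic ES model, the parameter names (`u_value`, `occupancy`), and the noise level are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a dynamic energy simulation; formula and parameter
# names are made up for illustration.
def energy_sim(u_value, occupancy):
    return 50 + 30 * u_value + 20 * occupancy + 5 * u_value * occupancy

# 1. Sample the dominant parameters and fit a statistical surrogate
#    (here, least squares on a small polynomial basis).
U = rng.uniform(0.5, 2.0, 200)
O = rng.uniform(0.2, 1.0, 200)
y = energy_sim(U, O)
A = np.column_stack([np.ones_like(U), U, O, U * O])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surrogate(u, o):
    return coef @ np.array([1.0, u, o, u * o])

# 2. Infer u_value from observed consumption by grid-based inverse
#    analysis under a Gaussian observation-noise assumption.
observed = 120.0
grid = np.linspace(0.5, 2.0, 100)
preds = np.array([surrogate(u, 0.6) for u in grid])
likelihood = np.exp(-0.5 * ((preds - observed) / 2.0) ** 2)
posterior = likelihood / likelihood.sum()
```

The cheap surrogate makes the inverse step tractable: evaluating the full ES model at every grid point (or MCMC sample) would be far more expensive.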
Abstract:
We present algorithms for tracking and reasoning about local traits at the subsystem level, based on the observed emergent behavior of multiple coordinated groups in potentially cluttered environments. Our proposed Bayesian inference schemes, which are primarily based on (Markov chain) Monte Carlo sequential methods, include: 1) an evolving network-based multiple-object tracking algorithm that is capable of categorizing objects into groups; 2) a multiple-cluster tracking algorithm for dealing with a prohibitively large number of objects; and 3) a causality inference framework for identifying dominant agents based exclusively on their observed trajectories. We use these as building blocks for developing a unified tracking and behavioral reasoning paradigm. Both synthetic and realistic examples are provided for demonstrating the derived concepts. © 2013 Springer-Verlag Berlin Heidelberg.
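The sequential Monte Carlo machinery underlying such schemes can be illustrated, in highly simplified form, by a bootstrap particle filter tracking a single object in one dimension. The dynamics, noise levels, and particle count below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Bootstrap particle filter for one object moving at constant velocity
# in 1-D -- the basic sequential Monte Carlo ingredient, without the
# grouping, clustering, or causality layers of the full framework.
n_particles, n_steps = 500, 30
true_x, obs_noise, proc_noise = 0.0, 0.5, 0.2

particles = rng.normal(0.0, 1.0, n_particles)
estimates = []
for _ in range(n_steps):
    true_x += 0.3                                    # ground-truth motion
    z = true_x + rng.normal(0, obs_noise)            # noisy observation
    particles += 0.3 + rng.normal(0, proc_noise, n_particles)  # propagate
    w = np.exp(-0.5 * ((z - particles) / obs_noise) ** 2)      # likelihood
    w /= w.sum()
    particles = rng.choice(particles, n_particles, p=w)        # resample
    estimates.append(particles.mean())
```

Group and cluster tracking layer additional structure (association, birth/death of groups) on top of this basic propagate-weight-resample loop.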
Abstract:
There are many methods for decomposing signals into a sum of amplitude and frequency modulated sinusoids. In this paper we take a new estimation-based approach. Identifying the problem as ill-posed, we show how to regularize the solution by imposing soft constraints on the amplitude and phase variables of the sinusoids. Estimation proceeds using a version of Kalman smoothing. We evaluate the method on synthetic and natural signals, both clean and noisy, showing that it outperforms previous decompositions, but at a higher computational cost. © 2012 IEEE.
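The flavour of this approach can be sketched with a Kalman filter (a filter rather than a full smoother, and a single sinusoid rather than a sum, for brevity) tracking the slowly varying in-phase/quadrature amplitudes of one component. The test signal and noise levels are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Model: x_t = a_t*cos(w t) + b_t*sin(w t) + noise, with (a_t, b_t)
# following a random walk -- a soft constraint that they vary slowly.
fs, w = 100.0, 2 * np.pi * 5.0
t = np.arange(0, 2, 1 / fs)
true_amp = 1.0 + 0.3 * t                    # slowly growing amplitude
x = true_amp * np.cos(w * t) + rng.normal(0, 0.1, len(t))

state = np.zeros(2)                         # [a, b]
P = np.eye(2)                               # state covariance
Q, R = 1e-4 * np.eye(2), 0.1 ** 2           # process / observation noise
amps = []
for k, xk in enumerate(x):
    P = P + Q                               # random-walk predict
    H = np.array([np.cos(w * t[k]), np.sin(w * t[k])])
    S = H @ P @ H + R                       # innovation variance
    K = P @ H / S                           # Kalman gain
    state = state + K * (xk - H @ state)
    P = P - np.outer(K, H) @ P
    amps.append(np.hypot(*state))           # instantaneous amplitude
```

A smoother would additionally run a backward pass, and the full method handles several sinusoids with phase (frequency) variables as well; this sketch only shows the regularised-state-estimation idea.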
Abstract:
Modelling is fundamental to many fields of science and engineering. A model can be thought of as a representation of possible data one could predict from a system. The probabilistic approach to modelling uses probability theory to express all aspects of uncertainty in the model. The probabilistic approach is synonymous with Bayesian modelling, which simply uses the rules of probability theory in order to make predictions, compare alternative models, and learn model parameters and structure from data. This simple and elegant framework is most powerful when coupled with flexible probabilistic models. Flexibility is achieved through the use of Bayesian non-parametrics. This article provides an overview of probabilistic modelling and an accessible survey of some of the main tools in Bayesian non-parametrics. The survey covers the use of Bayesian non-parametrics for modelling unknown functions, density estimation, clustering, time-series modelling, and representing sparsity, hierarchies, and covariance structure. More specifically, it gives brief non-technical overviews of Gaussian processes, Dirichlet processes, infinite hidden Markov models, Indian buffet processes, Kingman's coalescent, Dirichlet diffusion trees and Wishart processes.
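As a flavour of the first tool in that list, Gaussian-process regression with a squared-exponential kernel fits in a few lines. The data and hyperparameters below are illustrative:

```python
import numpy as np

# Squared-exponential (RBF) covariance between two sets of 1-D inputs.
def kernel(a, b, ell=1.0, sf=1.0):
    d = a[:, None] - b[None, :]
    return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)

X = np.array([-2.0, -1.0, 0.0, 1.5])        # training inputs (made up)
y = np.sin(X)                               # training targets
Xs = np.linspace(-3, 3, 50)                 # test inputs
noise = 1e-4                                # observation-noise variance

K = kernel(X, X) + noise * np.eye(len(X))
Ks = kernel(X, Xs)
alpha = np.linalg.solve(K, y)
mean = Ks.T @ alpha                                     # posterior mean
cov = kernel(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks)    # posterior covariance
```

The posterior mean interpolates the data and the posterior variance grows away from it, which is exactly the "distribution over unknown functions" view the survey describes; the other tools (Dirichlet processes, Indian buffet processes, etc.) play the analogous role for partitions, feature allocations, and so on.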
Abstract:
A key function of the brain is to interpret noisy sensory information. To do so optimally, observers must, in many tasks, take into account knowledge of the precision with which stimuli are encoded. In an orientation change detection task, we find that encoding precision depends not only on an experimentally controlled reliability parameter (shape), but also exhibits additional variability. In spite of this variability in precision, human subjects seem to take precision into account near-optimally on a trial-to-trial and item-to-item basis. Our results offer a new conceptualization of the encoding of sensory information and highlight the brain's remarkable ability to incorporate knowledge of uncertainty during complex perceptual decision-making.