18 results for information bottleneck method

in the Cambridge University Engineering Department Publications Database


Relevance: 100.00%

Abstract:

Understanding the guiding principles of sensory coding strategies is a main goal in computational neuroscience. Among others, the principles of predictive coding and slowness appear to capture aspects of sensory processing. Predictive coding postulates that sensory systems are adapted to the structure of their input signals such that information about future inputs is encoded. Slow feature analysis (SFA) is a method for extracting slowly varying components from quickly varying input signals, thereby learning temporally invariant features. Here, we use the information bottleneck method to state an information-theoretic objective function for temporally local predictive coding. We then show that the linear case of SFA can be interpreted as a variant of predictive coding that maximizes the mutual information between the current output of the system and the input signal in the next time step. This demonstrates that the slowness principle and predictive coding are intimately related.
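The linear case of SFA referred to above can be sketched in a few lines: whiten the input, then find the directions whose temporal derivative has minimal variance. The following is a minimal illustration in Python/NumPy; the toy signal and the whitening-plus-eigendecomposition route are generic choices for illustration, not the authors' exact formulation.

    import numpy as np

    def linear_sfa(x, n_components=1):
        """Minimal linear slow feature analysis.

        x: array of shape (T, D), a multivariate time series.
        Returns the n_components slowest linear projections of x.
        """
        x = x - x.mean(axis=0)
        # Whiten the input so its covariance becomes the identity.
        cov = np.cov(x, rowvar=False)
        eigval, eigvec = np.linalg.eigh(cov)
        whitener = eigvec / np.sqrt(eigval)
        z = x @ whitener
        # Slow directions minimise the variance of the temporal difference.
        dcov = np.cov(np.diff(z, axis=0), rowvar=False)
        _, w_eigvec = np.linalg.eigh(dcov)
        w = w_eigvec[:, :n_components]   # smallest eigenvalues = slowest features
        return z @ w

    # Example: a slow sine hidden in two quickly varying mixtures.
    t = np.linspace(0, 2 * np.pi, 1000)
    slow, fast = np.sin(t), np.sin(29 * t)
    x = np.column_stack([slow + fast, slow - 0.5 * fast])
    y = linear_sfa(x)                    # approximately recovers the slow sine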

Relevance: 30.00%

Abstract:

We consider the problem of blind multiuser detection. We adopt a Bayesian approach where unknown parameters are considered random and integrated out. Computing the maximum a posteriori estimate of the input data sequence requires solving a combinatorial optimization problem. We propose to apply the Cross-Entropy method recently introduced by Rubinstein. The performance of the Cross-Entropy method is compared to that of Markov chain Monte Carlo: for similar bit error rate performance, we demonstrate that the Cross-Entropy method outperforms a generic Markov chain Monte Carlo method in terms of computation time.
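As an illustration of the Cross-Entropy method on a combinatorial detection problem, the sketch below searches for a binary (+/-1) sequence minimising a cost function; the toy least-squares channel model and all parameters are illustrative assumptions, not those of the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy problem: recover a binary (+/-1) sequence b from y = H b + noise.
    K = 12
    H = rng.normal(size=(K, K))
    b_true = rng.choice([-1.0, 1.0], size=K)
    y = H @ b_true + 0.1 * rng.normal(size=K)

    def cost(b):
        return np.sum((y - H @ b) ** 2)

    # Cross-Entropy method: sample candidates from independent Bernoulli
    # distributions, keep the elite samples, and refit the probabilities.
    p = np.full(K, 0.5)                    # P(b_k = +1)
    n_samples, n_elite, smoothing = 200, 20, 0.7
    for _ in range(50):
        samples = np.where(rng.random((n_samples, K)) < p, 1.0, -1.0)
        costs = np.array([cost(s) for s in samples])
        elite = samples[np.argsort(costs)[:n_elite]]
        p = smoothing * (elite > 0).mean(axis=0) + (1 - smoothing) * p
    b_hat = np.where(p > 0.5, 1.0, -1.0)   # converges to b_true in this toy setting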

Relevance: 30.00%

Abstract:

Sequential Monte Carlo (SMC) methods are popular computational tools for Bayesian inference in non-linear non-Gaussian state-space models. For this class of models, we propose SMC algorithms to compute the score vector and observed information matrix recursively in time. We propose two different SMC implementations, one with computational complexity $\mathcal{O}(N)$ and the other with complexity $\mathcal{O}(N^{2})$, where $N$ is the number of importance sampling draws. Although cheaper, the performance of the $\mathcal{O}(N)$ method degrades quickly in time, as it inherently relies on the SMC approximation of a sequence of probability distributions whose dimension increases linearly with time. In particular, even under strong mixing assumptions, the variance of the estimates computed with the $\mathcal{O}(N)$ method increases at least quadratically in time. The $\mathcal{O}(N^{2})$ method is a non-standard SMC implementation that does not suffer from this rapid degradation. We then show how both methods can be used to perform batch and recursive parameter estimation.
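For background, one standard route to such particle estimates of the score (stated here as a generic identity, not necessarily the authors' exact derivation) is Fisher's identity, which writes the score of the marginal likelihood as a smoothed expectation:

    $\nabla_{\theta} \log p_{\theta}(y_{1:T}) = \mathbb{E}_{p_{\theta}(x_{1:T} \mid y_{1:T})} \left[ \nabla_{\theta} \log p_{\theta}(x_{1:T}, y_{1:T}) \right]$

where $\log p_{\theta}(x_{1:T}, y_{1:T}) = \sum_{t=1}^{T} \left( \log f_{\theta}(x_{t} \mid x_{t-1}) + \log g_{\theta}(y_{t} \mid x_{t}) \right)$ for transition density $f_{\theta}$ and observation density $g_{\theta}$. Approximating this expectation with the surviving ancestral paths of a particle filter gives an $\mathcal{O}(N)$ estimator of the kind whose variance growth is discussed above, while estimators built from marginal smoothing approximations lead to $\mathcal{O}(N^{2})$ variants.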

Relevance: 30.00%

Abstract:

Using the Hybrid method (FE + SEA), it is possible to estimate the frequency response of an uncertain structure. The current work develops the Hybrid method to allow time domain analysis of the shock response of a structure. Two problems must be overcome when taking Hybrid method results into the time domain: a) the Hybrid method frequency response has no phase information, and b) the Hybrid method frequency response is smoothed in frequency and shows no modal peaks. In this paper the first problem is overcome using minimum phase reconstruction. Minimum phase reconstruction and its limitations are explained, and its application to shock problems is described. © 2009 IOP Publishing Ltd.
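A common way to perform minimum phase reconstruction from a magnitude-only frequency response is via the real cepstrum: fold the cepstrum of the log magnitude and transform back to obtain a phase consistent with a minimum-phase system. The sketch below shows this generic procedure in Python/NumPy; the example response is invented and this is not necessarily the exact implementation used in the paper.

    import numpy as np

    def minimum_phase_response(magnitude):
        """Reconstruct a minimum-phase frequency response from its magnitude.

        magnitude: sampled |H(f)| on a full FFT grid of even length N.
        Returns the complex minimum-phase response H_min(f).
        """
        n = len(magnitude)
        log_mag = np.log(np.maximum(magnitude, 1e-12))   # avoid log(0)
        cepstrum = np.real(np.fft.ifft(log_mag))
        # Fold the cepstrum: keep c[0] and c[N/2], double the positive
        # quefrencies, zero the negative ones.  This enforces minimum phase.
        window = np.zeros(n)
        window[0] = 1.0
        window[1:n // 2] = 2.0
        window[n // 2] = 1.0
        log_min_phase = np.fft.fft(window * cepstrum)
        return np.exp(log_min_phase)

    # Example: a smoothed, low-pass-like magnitude on a 1024-point FFT grid.
    freqs = np.fft.fftfreq(1024)
    mag = 1.0 / np.sqrt(1.0 + (freqs / 0.1) ** 2)
    h_min = minimum_phase_response(mag)   # complex response with reconstructed phase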

Relevance: 30.00%

Abstract:

This paper extends a state projection method for structure preserving model reduction to situations where only a weaker notion of system structure is available. This weaker notion of structure, identifying the causal relationships between manifest variables of the system, is especially relevant in settings such as systems biology, where a clear partition of state variables into distinct subsystems may be unknown, or may not even exist. The resulting technique, like similar approaches, does not provide theoretical performance guarantees, so an extensive computational study is conducted, and the technique is observed to work fairly well in practice. Moreover, conditions characterizing structurally minimal realizations and sufficient conditions characterizing edge loss resulting from the reduction process are presented. ©2009 IEEE.
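For context, plain projection-based reduction of an LTI state-space model can be sketched as below: a Galerkin projection onto a retained subspace, here chosen from the dominant eigenvectors of the controllability Gramian. This is a generic illustration only; it does not enforce the structure-preserving constraints on manifest-variable relationships that the paper develops.

    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    def project_lti(A, B, C, V):
        """Reduce dx/dt = A x + B u, y = C x by Galerkin projection onto range(V)."""
        return V.T @ A @ V, V.T @ B, C @ V

    rng = np.random.default_rng(1)
    n, r = 10, 3
    A = -np.eye(n) + 0.1 * rng.normal(size=(n, n))   # a stable random system
    B = rng.normal(size=(n, 1))
    C = rng.normal(size=(1, n))

    # Illustrative choice of subspace: dominant eigenvectors of the
    # controllability Gramian Wc, which solves A Wc + Wc A' = -B B'.
    Wc = solve_continuous_lyapunov(A, -B @ B.T)
    eigval, eigvec = np.linalg.eigh(Wc)
    V = eigvec[:, -r:]                               # orthonormal columns
    Ar, Br, Cr = project_lti(A, B, C, V)             # reduced r-state model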

Relevance: 30.00%

Abstract:

Obtaining accurate confidence measures for automatic speech recognition (ASR) transcriptions is an important task which stands to benefit from the use of multiple information sources. This paper investigates the application of conditional random field (CRF) models as a principled technique for combining multiple features from such sources. A novel method for combining suitably defined features is presented, allowing for confidence annotation using lattice-based features of hypotheses other than the lattice 1-best. The resulting framework is applied to different stages of a state-of-the-art large vocabulary speech recognition pipeline, and consistent improvements are shown over a sophisticated baseline system. Copyright © 2011 ISCA.
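One readily available linear-chain CRF implementation is sklearn_crfsuite; the sketch below labels each recognised word as correct ('C') or erroneous ('E') and takes the marginal probability of 'C' as a word-level confidence score. The feature names and values are invented placeholders, not the lattice-based features or pipeline of the paper.

    import sklearn_crfsuite

    # Each recognised word gets a dict of confidence features (hypothetical
    # names) and a binary label: 'C' = correct, 'E' = error.
    X_train = [
        [{'posterior': 0.91, 'duration': 0.30, 'lm_score': -2.1},
         {'posterior': 0.42, 'duration': 0.12, 'lm_score': -5.7}],
        [{'posterior': 0.88, 'duration': 0.25, 'lm_score': -1.9}],
    ]
    y_train = [['C', 'E'], ['C']]

    crf = sklearn_crfsuite.CRF(algorithm='lbfgs', c1=0.1, c2=0.1,
                               max_iterations=100)
    crf.fit(X_train, y_train)

    # Confidence of a new word = marginal probability of the 'C' label.
    X_test = [[{'posterior': 0.77, 'duration': 0.20, 'lm_score': -3.0}]]
    marginals = crf.predict_marginals(X_test)
    confidence = marginals[0][0]['C']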

Relevance: 30.00%

Abstract:

Decision-making at the front-end of innovation is critical for the success of companies. This paper presents a simple visual method, called DMCA (Decision-Making Criteria Assessment), which was created to clarify and improve decision-making at the front-end of innovation. The method maps the uncertainty of project information and importance of decision criteria, compiling a measure that indicates whether the decision is highly uncertain, what information interferes with it, and what criteria are actually being considered. The DMCA method was tested in two projects that faced decision-making issues, and the results confirm the benefits of using this method in decision-making at the front-end. © 2012 IEEE.

Relevance: 30.00%

Abstract:

The nervous system implements a networked control system in which the plants take the form of limbs, the controller is the brain, and neurons form the communication channels. Unlike standard networked control architectures, there is no periodic sampling, and the fundamental units of communication contain little numerical information. This paper describes a novel communication channel, modeled after spiking neurons, in which the transmitter integrates an input signal and sends out a spike when the integral reaches a threshold value. The receiver then filters the sequence of spikes to approximately reconstruct the input signal. It is shown that for appropriate choices of channel parameters, stable feedback control over these spiking channels is possible. Furthermore, good tracking performance can be achieved. The data rate of the channel increases linearly with the size of the inputs. Thus, when placed in a feedback loop, small loop gains imply a low data rate. ©2010 IEEE.
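The channel described above can be illustrated with a short simulation: an integrate-and-fire style encoder emits a spike each time the running integral of the input crosses a threshold, and the receiver low-pass filters the spike train to approximate the input. The threshold, filter constant and test signal below are arbitrary illustrative choices, not the parameters studied in the paper.

    import numpy as np

    dt, T = 1e-3, 2.0
    t = np.arange(0.0, T, dt)
    u = 1.0 + 0.5 * np.sin(2 * np.pi * 1.5 * t)   # input signal to transmit

    # Transmitter: integrate the input, emit a spike when the threshold is hit.
    threshold = 0.02
    integral = 0.0
    spikes = np.zeros_like(u)
    for k, uk in enumerate(u):
        integral += uk * dt
        if integral >= threshold:
            spikes[k] = 1.0
            integral -= threshold

    # Receiver: first-order low-pass filter of the spike train, scaled so each
    # spike contributes 'threshold' worth of signal area.
    tau = 0.05
    estimate = np.zeros_like(u)
    for k in range(1, len(u)):
        estimate[k] = (1 - dt / tau) * estimate[k - 1] + (threshold / tau) * spikes[k]

    # The spike rate grows linearly with the input amplitude, so small inputs
    # (small loop gains in feedback) need only a low data rate.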

Relevance: 30.00%

Abstract:

Gaussian processes are gaining increasing popularity among the control community, in particular for the modelling of discrete time state space systems. However, it has not been clear how to incorporate model information, in the form of known state relationships, when using a Gaussian process as a predictive model. An obvious example of known prior information is position and velocity related states. Incorporation of such information would be beneficial both computationally and for faster dynamics learning. This paper introduces a method of achieving this, yielding faster dynamics learning and a reduction in computational effort from O(Dn²) to O((D - F)n²) in the prediction stage for a system with D states, F known state relationships and n observations. The effectiveness of the method is demonstrated through its inclusion in the PILCO learning algorithm with application to the swing-up and balance of a torque-limited pendulum and the balancing of a robotic unicycle in simulation. © 2012 IEEE.
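The idea can be sketched with a simple pendulum-like system: rather than learning separate models for position and velocity, learn a GP only for the velocity increment and obtain the position prediction from the known relationship angle_{t+1} = angle_t + dt * velocity_t. The minimal setup below uses scikit-learn and an invented toy system; it illustrates the D versus D - F saving only and is not the PILCO implementation.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    dt = 0.05
    rng = np.random.default_rng(0)

    # Simulated training data from a damped pendulum: state is (angle, velocity).
    def step(s):
        angle, vel = s
        acc = -9.81 * np.sin(angle) - 0.2 * vel
        return np.array([angle + dt * vel, vel + dt * acc])

    states = rng.uniform([-np.pi, -2.0], [np.pi, 2.0], size=(200, 2))
    next_states = np.array([step(s) for s in states])

    # Learn a GP only for the velocity change: D - F = 1 model instead of D = 2.
    gp_vel = GaussianProcessRegressor(kernel=RBF(length_scale=1.0))
    gp_vel.fit(states, next_states[:, 1] - states[:, 1])

    def predict(state):
        angle, vel = state
        dvel = gp_vel.predict(state.reshape(1, -1))[0]
        # The known relationship supplies the position update exactly.
        return np.array([angle + dt * vel, vel + dvel])

    print(predict(np.array([0.5, 0.0])), step(np.array([0.5, 0.0])))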

Relevance: 30.00%

Abstract:

This paper proposes a method for analysing the operational complexity in supply chains by using an entropic measure based on information theory. The proposed approach estimates the operational complexity at each stage of the supply chain and analyses the changes between stages. In this paper a stage is identified by the exchange of data and/or material. Through analysis the method identifies the stages where the operational complexity is both generated and propagated (exported, imported, generated or absorbed). Central to the method is the identification of a reference point within the supply chain. This is where the operational complexity is at a local minimum along the data transfer stages. Such a point can be thought of as a 'sink' for turbulence generated in the supply chain. Where it exists, it has the merit of stabilising the supply chain by attenuating uncertainty. However, the location of the reference point is also a matter of choice. If the preferred location is other than the current one, this is a trigger for management action. The analysis can help decide appropriate remedial action. More generally, the approach can assist logistics management by highlighting problem areas. An industrial application is presented to demonstrate the applicability of the method. © 2013 Operational Research Society Ltd. All rights reserved.
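The entropic measure underlying such an analysis can be illustrated as follows: for each data or material transfer stage, estimate the probability of each observed state (for example on-time, late and very late deliveries) and compute the Shannon entropy; comparing entropies across consecutive stages then indicates where operational complexity is generated or absorbed, and the lowest-entropy stage plays the role of the reference point. The stage names and counts below are invented for illustration and do not come from the paper's industrial application.

    import numpy as np

    def shannon_entropy(counts):
        """Shannon entropy (bits) of a stage from counts of its observed states."""
        p = np.asarray(counts, dtype=float)
        p = p[p > 0] / p.sum()
        return float(-(p * np.log2(p)).sum())

    # Hypothetical counts of delivery states (on time, late, very late) observed
    # at each data/material transfer stage of a supply chain.
    stages = {
        'supplier -> plant':     [80, 15, 5],
        'plant -> warehouse':    [60, 30, 10],
        'warehouse -> retailer': [90, 8, 2],
    }

    entropies = {name: shannon_entropy(c) for name, c in stages.items()}
    names = list(stages)
    for upstream, downstream in zip(names, names[1:]):
        change = entropies[downstream] - entropies[upstream]
        verdict = 'generates' if change > 0 else 'absorbs'
        print(f'{downstream}: {verdict} {abs(change):.2f} bits of operational complexity')

    # The stage with the lowest entropy acts as the reference point ('sink').
    print('reference point:', min(entropies, key=entropies.get))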