36 results for Algorithmic information theory

in Aston University Research Archive


Relevance: 100.00%

Abstract:

We present an information-theory analysis of the tradeoff between bit-error-rate improvement and data-rate loss when skewed channel coding is used to suppress pattern-dependent errors in digital communications. Without loss of generality, we apply the developed general theory to the particular example of a high-speed fiber communication system with a strong patterning effect. © 2007 IEEE.
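
As a minimal numerical sketch of the rate-loss side of this tradeoff (assuming a simple i.i.d. bias model, which is our simplification rather than the paper's channel), skewing the ones-density of the transmitted stream from 1/2 to p reduces the information carried per symbol to the binary entropy H2(p):

```python
import math

def binary_entropy(p: float) -> float:
    """H2(p), in bits: information per symbol of an i.i.d. binary source with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

# Rate paid for skewing the ones-density away from 1/2:
for p in (0.5, 0.4, 0.3, 0.2):
    print(f"P(1) = {p:.1f}: rate = {binary_entropy(p):.3f} bit/symbol, "
          f"rate loss = {1.0 - binary_entropy(p):.3f}")
```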

Relevance: 100.00%

Abstract:

In this paper we summarize our recently proposed work on the information-theory analysis of regenerative channels. We discuss how the design and the transfer-function properties of the regenerator affect the noise statistics and enable Shannon capacities higher than those of the corresponding linear channels (i.e., in the absence of regeneration).
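
A toy illustration of the mechanism, under assumptions of our own (binary ±1 levels, Gaussian noise, a hypothetical tanh-shaped regenerator transfer function, none of which are taken from the paper): a saturating transfer function pulls noisy samples back toward the signal levels, reshaping the noise statistics that enter the capacity comparison.

```python
import numpy as np

rng = np.random.default_rng(0)

signal = rng.choice([-1.0, 1.0], size=100_000)        # binary levels on a linear channel
noisy = signal + 0.3 * rng.standard_normal(signal.size)

regenerated = np.tanh(3.0 * noisy)                    # hypothetical saturating transfer function
levels = np.tanh(3.0 * signal)                        # where the two signal levels are mapped

print("noise variance, linear channel:     ", np.var(noisy - signal))
print("noise variance, after regeneration: ", np.var(regenerated - levels))
```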

Relevance: 90.00%

Abstract:

This Thesis addresses the problem of automated, false-positive-free detection of epileptic events by the fusion of information extracted from simultaneously recorded electroencephalographic (EEG) and electrocardiographic (ECG) time series. The approach relies on a biomedical case for the coupling of the brain and heart systems through the central autonomic network during temporal lobe epileptic events: neurovegetative manifestations associated with such events consist of alterations to the cardiac rhythm. From a neurophysiological perspective, epileptic episodes are characterised by a loss of complexity of the state of the brain. The probabilistic description of arrhythmias observed during temporal lobe epileptic events and the information-theoretic description of the complexity of the state of the brain are integrated in a fusion-of-information framework for temporal lobe epileptic seizure detection. The main contributions of the Thesis include: the introduction of a biomedical case for the coupling of the brain and heart systems during temporal lobe epileptic seizures, partially reported in the clinical literature; the investigation of measures for the characterisation of ictal events from the EEG time series and their integration in a fusion-of-knowledge framework; the probabilistic description of arrhythmias observed during temporal lobe epileptic events and their integration in a fusion-of-knowledge framework; and the investigation of the levels of the fusion-of-information architecture at which to combine the information extracted from the EEG and ECG time series. The method designed in the Thesis for the false-positive-free automated detection of epileptic events achieved a false-positive rate of zero on the dataset of long-term recordings used in the Thesis.
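
The abstract does not name the complexity measure used; as one common information-theoretic choice, offered purely as an illustrative sketch, the spectral entropy of an EEG epoch drops when broadband background activity gives way to rhythmic ictal discharge:

```python
import numpy as np

def spectral_entropy(x: np.ndarray, eps: float = 1e-12) -> float:
    """Shannon entropy (bits) of the normalized power spectrum of a signal epoch."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / (psd.sum() + eps)
    return float(-(p * np.log2(p + eps)).sum())

rng = np.random.default_rng(0)
t = np.arange(0, 4.0, 1 / 256)   # a 4 s epoch sampled at 256 Hz (illustrative values)

background = rng.standard_normal(t.size)                                # broadband activity
ictal = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)   # rhythmic 5 Hz discharge

print("background entropy:", spectral_entropy(background))  # higher: complex state
print("ictal entropy:     ", spectral_entropy(ictal))       # lower: loss of complexity
```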

Relevance: 90.00%

Abstract:

In the agrifood sector, the explosive increase in information about environmental sustainability, often held in uncoordinated information systems, has created a new form of ignorance ('meta-ignorance') that diminishes the effectiveness of that information for decision-makers. Flows of information are governed by informal and formal social arrangements that we can collectively call informational institutions. In this paper we review the recent literature on such institutions. From the perspectives of information theory and new institutional economics, current informational institutions are increasing the information entropy of communications concerning environmental sustainability, and stakeholders' transaction costs of using relevant information. In our view this reduces the effectiveness of informational governance. Future research on informational governance should explicitly address these aspects.

Relevance: 80.00%

Abstract:

Statistical physics is employed to evaluate the performance of error-correcting codes at finite message length for an ensemble of Gallager's error-correcting codes. We follow Gallager's approach of upper-bounding the average decoding error rate, but invoke the replica method to reproduce the tightest general bound to date and to improve on the most accurate zero-error noise-level threshold reported in the literature. The relation between the methods used here and those presented in the information theory literature is explored.
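
As an illustrative aside (our construction, not necessarily the exact ensemble analysed in the paper), Gallager's classic regular ensemble can be sampled by stacking column-permuted copies of a fixed base block:

```python
import numpy as np

def gallager_parity_check(n: int, j: int, k: int, seed: int = 0) -> np.ndarray:
    """Sample an (n, j, k) parity-check matrix from Gallager's regular ensemble:
    a base block with k ones per row, stacked j times under random column permutations."""
    assert n % k == 0, "n must be divisible by the row weight k"
    rng = np.random.default_rng(seed)
    base = np.zeros((n // k, n), dtype=np.uint8)
    for i in range(n // k):
        base[i, i * k:(i + 1) * k] = 1
    blocks = [base] + [base[:, rng.permutation(n)] for _ in range(j - 1)]
    return np.vstack(blocks)

H = gallager_parity_check(n=20, j=3, k=4)
print(H.shape)        # (15, 20): code rate >= 1 - j/k = 1/4
print(H.sum(axis=0))  # every column has weight j = 3
print(H.sum(axis=1))  # every row has weight k = 4
```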

Relevance: 80.00%

Abstract:

The performance of "typical set (pairs) decoding" for ensembles of Gallager's linear codes is investigated using statistical physics. In this decoding method, errors occur either when the information transmission is corrupted by atypical noise or when multiple typical sequences satisfy the parity-check equation associated with the received corrupted codeword. We show that the average error rate for the second type of error over a given code ensemble can be accurately evaluated using the replica method, including its sensitivity to message length. Our approach generally improves on the existing analysis known in the information theory community, which was recently reintroduced in IEEE Trans. Inf. Theory 45, 399 (1999) and is believed to be the most accurate to date. © 2002 The American Physical Society.
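
A brute-force toy of the second error mechanism, on a code far too small to be practical (all parameters below are ours, for illustration only): decoding is ambiguous whenever more than one noise vector of near-typical weight reproduces the observed syndrome.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n, p = 12, 0.1                                      # tiny block length, BSC flip rate
H = (rng.random((6, n)) < 0.25).astype(np.uint8)    # small random sparse parity-check matrix

true_noise = (rng.random(n) < p).astype(np.uint8)   # the noise the channel actually added
syndrome = H @ true_noise % 2                       # what the decoder observes

# Typical-pairs view: collect every noise vector whose weight lies in a crude
# typicality window around n*p and which reproduces the observed syndrome.
candidates = []
for w in range(0, 4):                               # window around n*p = 1.2
    for idx in itertools.combinations(range(n), w):
        e = np.zeros(n, dtype=np.uint8)
        e[list(idx)] = 1
        if np.array_equal(H @ e % 2, syndrome):
            candidates.append(e)

# The second error type of the abstract: more than one typical explanation.
print("typical noise vectors matching the syndrome:", len(candidates))
```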

Relevance: 80.00%

Abstract:

We propose a method to determine the critical noise level for decoding Gallager-type low-density parity-check (LDPC) error-correcting codes. The method is based on the magnetization enumerator (M), rather than on the weight enumerator (W) presented recently in the information theory literature. The interpretation of our method is appealingly simple, and the relation between different decoding schemes, such as typical-pairs decoding, MAP, and finite-temperature (MPM) decoding, becomes clear. Our results are more optimistic than those derived via the methods of information theory and are in excellent agreement with recent results from another statistical physics approach.

Relevance: 80.00%

Abstract:

We determine the critical noise level for decoding low-density parity-check error-correcting codes based on the magnetization enumerator (M), rather than on the weight enumerator (W) employed in the information theory literature. The interpretation of our method is appealingly simple, and the relation between different decoding schemes, such as typical-pairs decoding, MAP, and finite-temperature (MPM) decoding, becomes clear. In addition, our analysis provides an explanation for the difference in performance between MN and Gallager codes. Our results are more optimistic than those derived via the methods of information theory and are in excellent agreement with recent results from another statistical physics approach.

Relevance: 80.00%

Abstract:

Modern digital communication systems achieve reliable transmission by employing error-correction techniques that introduce redundancy. Low-density parity-check codes work along the principles of the Hamming code, but their parity-check matrix is very sparse, and multiple errors can be corrected. The sparseness of the matrix allows the decoding process to be carried out by probability-propagation methods similar to those employed in turbo codes. The relation between spin systems in statistical physics and digital error-correcting codes is based on the existence of a simple isomorphism between the additive Boolean group and the multiplicative binary group. Shannon proved general results on the natural limits of compression and error correction by setting up the framework known as information theory. Error-correcting codes are based on mapping the original space of words onto a higher-dimensional space in such a way that the typical distance between encoded words increases.
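
To make the isomorphism concrete: a bit x in {0,1} under addition mod 2 maps to a spin s = (-1)^x in {+1,-1} under multiplication, so parity checks become products of spins. A minimal demonstration:

```python
import itertools

def to_spin(x: int) -> int:
    """The isomorphism between ({0,1}, XOR) and ({+1,-1}, *): x -> (-1)**x."""
    return (-1) ** x

# Addition mod 2 on bits corresponds exactly to multiplication on spins:
for a, b in itertools.product([0, 1], repeat=2):
    assert to_spin(a ^ b) == to_spin(a) * to_spin(b)

# Hence a parity check x1 ^ x2 ^ x3 == 0 becomes the multispin constraint
# s1 * s2 * s3 == +1, which is how a linear code maps onto an Ising spin system.
x1, x2, x3 = 1, 0, 1
print("parity of bits:  ", x1 ^ x2 ^ x3)                             # 0: check satisfied
print("product of spins:", to_spin(x1) * to_spin(x2) * to_spin(x3))  # +1: same check
```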

Relevance: 80.00%

Abstract:

We review recent theoretical progress on the statistical mechanics of error-correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multispin interactions, one can carry out a statistical-mechanics-based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions respectively, as well as the behaviour of error exponents averaged over the corresponding code ensemble as a function of the channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems.

Relevance: 80.00%

Abstract:

A novel approach, based on statistical mechanics, to analysing the typical performance of optimum code-division multiple-access (CDMA) multiuser detectors is reviewed. A 'black-box' view of the basic CDMA channel is introduced, from which the CDMA multiuser detection problem is regarded as a 'learning-from-examples' problem for the 'binary linear perceptron' of the neural network literature. Adopting the Bayes framework, analysis of the performance of the optimum CDMA multiuser detectors reduces to evaluating the average of the cumulant generating function of a relevant posterior distribution. This evaluation is carried out, by formal analogy with a similar calculation in spin glass theory in statistical mechanics, using the replica method, a method developed in that theory.
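
Schematically, and with all notation (S, b, sigma, K, N) chosen here for illustration rather than taken from the paper, the 'black-box' channel view can be simulated as follows:

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, sigma = 16, 32, 0.5     # users, chips per symbol, noise level (all illustrative)

b = rng.choice([-1.0, 1.0], size=K)                      # users' binary symbols to infer
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)    # random binary spreading sequences

# Black-box view: the detector observes only (S, y) and must recover b --
# formally a 'learning-from-examples' problem for a binary linear perceptron.
y = S @ b + sigma * rng.standard_normal(N)

b_hat = np.sign(S.T @ y)      # conventional matched-filter detector, for reference
print("matched-filter bit errors:", int(np.sum(b_hat != b)), "of", K)
```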

Relevance: 80.00%

Abstract:

We propose a method based on the magnetization enumerator to determine the critical noise level for Gallager-type low-density parity-check (LDPC) error-correcting codes. Our method provides an appealingly simple interpretation of the relation between different decoding schemes, and yields more optimistic critical noise levels than those reported in the information theory literature.

Relevance: 80.00%

Abstract:

The replica method, developed in statistical physics, is employed in conjunction with Gallager's methodology to accurately evaluate zero-error noise thresholds for Gallager code ensembles. Our approach generally provides more optimistic evaluations than those reported in the information theory literature for sparse matrices; the difference vanishes as the parity-check matrix becomes dense.

Relevance: 80.00%

Abstract:

An analytical framework for analyzing the stage-by-stage detection dynamics of the multistage CDMA multiuser detector is presented. The density-evolution idea is applied to analyze the multistage detector. The message distribution is treated by a Gaussian approximation, but the interstage correlation of messages is systematically taken into account, which turns out to provide a significant improvement.
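
The paper's recursion, with its interstage message correlations, is not reproduced here; as a clearly labelled stand-in from the same literature, the sketch below iterates the well-known Tse-Hanly large-system fixed point for the SINR of the linear MMSE multiuser detector, which shares the flavour of a stage-by-stage Gaussian-approximation update:

```python
def mmse_sinr(beta: float, snr: float, stages: int = 50) -> float:
    """Fixed point of eta = snr / (1 + beta * snr / (1 + eta)) by repeated substitution:
    the large-system SINR of the linear MMSE detector (Tse-Hanly), with beta = K/N."""
    eta = snr
    for _ in range(stages):      # each pass plays the role of one 'stage'
        eta = snr / (1.0 + beta * snr / (1.0 + eta))
    return eta

for beta in (0.25, 0.5, 1.0):
    print(f"load beta = {beta:.2f}: asymptotic SINR ~ {mmse_sinr(beta, snr=10.0):.3f}")
```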

Relevance: 80.00%

Abstract:

We analyze, using the replica method of statistical mechanics, the theoretical performance of coded code-division multiple-access (CDMA) systems in which regular low-density parity-check (LDPC) codes are used for channel coding.