19 results for "Decoding principle"

in Aston University Research Archive


Relevance:

20.00%

Publisher:

Abstract:

We employ two different methods, based on belief propagation and TAP, for decoding corrupted messages encoded by Sourlas's method, where the codeword comprises products of K bits selected randomly from the original message. We show that the equations obtained by the two approaches are similar and provide the same solution as the one obtained by the replica approach in the case K=2. However, we also show that for K>=3 and unbiased messages the iterative solution is sensitive to the initial conditions and is likely to provide erroneous solutions, and that it is generally beneficial to use Nishimori's temperature, especially in the case of biased messages.
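The code construction described above is simple to sketch. The following toy (all names and parameters are illustrative, not from the paper) encodes a ±1-spin message so that each codeword bit is the product of K randomly chosen message bits, and decodes by exhaustive maximum-likelihood search, which is only feasible at toy sizes; BP/TAP exist precisely to replace this search at realistic message lengths.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

def sourlas_encode(msg, checks):
    # Each codeword bit is the product of K message bits (+-1 spins).
    return np.array([np.prod(msg[list(c)]) for c in checks])

def ml_decode(received, checks, n):
    # Exhaustive maximum-likelihood search: pick the candidate message
    # whose codeword agrees with the received word on the most bits.
    # Only feasible for tiny n; BP/TAP replace this in practice.
    best, best_score = None, -1
    for cand in itertools.product([-1, 1], repeat=n):
        s = np.array(cand)
        score = int(np.sum(sourlas_encode(s, checks) == received))
        if score > best_score:
            best, best_score = s, score
    return best

N, K, M = 8, 3, 40                      # message bits, bits per product, codeword bits
checks = [tuple(rng.choice(N, size=K, replace=False)) for _ in range(M)]
msg = rng.choice([-1, 1], size=N)
codeword = sourlas_encode(msg, checks)  # noiseless transmission for illustration
decoded = ml_decode(codeword, checks, N)
```

With noise, some codeword bits would be flipped before decoding, and the paper's question is which iterative scheme then still finds the right maximum.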

Relevance:

20.00%

Publisher:

Abstract:

Statistical physics is employed to evaluate the finite-message-length performance of an ensemble of Gallager's error-correcting codes. We follow Gallager's approach of upper-bounding the average decoding error rate, but invoke the replica method to reproduce the tightest general bound to date and to improve on the most accurate zero-error noise-level threshold reported in the literature. The relation between the methods used and those presented in the information theory literature is explored.

Relevance:

20.00%

Publisher:

Abstract:

The performance of "typical set (pairs) decoding" for ensembles of Gallager's linear codes is investigated using statistical physics. In this decoding method, errors occur either when the information transmission is corrupted by atypical noise, or when multiple typical sequences satisfy the parity-check equations provided by the received corrupted codeword. We show that the average error rate for the second type of error over a given code ensemble can be accurately evaluated using the replica method, including its sensitivity to message length. Our approach generally improves on the existing analysis known in the information theory community, which was recently reintroduced in IEEE Trans. Inf. Theory 45, 399 (1999) and is believed to be the most accurate to date. © 2002 The American Physical Society.

Relevance:

20.00%

Publisher:

Abstract:

We determine the critical noise level for decoding low-density parity-check error-correcting codes based on the magnetization enumerator, rather than on the weight enumerator employed in the information theory literature. The interpretation of our method is appealingly simple, and the relation between the different decoding schemes, such as typical-pairs decoding, MAP, and finite-temperature decoding (MPM), becomes clear. In addition, our analysis provides an explanation for the difference in performance between MN and Gallager codes. Our results are more optimistic than those derived via the methods of information theory and are in excellent agreement with recent results from another statistical physics approach.

Relevance:

20.00%

Publisher:

Abstract:

The analysis and prediction of the dynamic behaviour of structural components plays an important role in modern engineering design. In this work, the so-called "mixed" finite element models based on Reissner's variational principle are applied to the solution of free and forced vibration problems for beam and plate structures. The mixed beam models are obtained by using elements with various shape functions, ranging from simple linear to more complex quadratic and cubic functions. The elements were in general capable of predicting the natural frequencies and dynamic responses with good accuracy. An isoparametric quadrilateral element with 8 nodes was developed for application to thin plate problems. The element has 32 degrees of freedom (one deflection, two bending moments and one twisting moment per node), which is suitable for the discretization of plates with arbitrary geometry. A linear isoparametric element and two non-conforming displacement elements (4-node and 8-node quadrilateral) were extended to the solution of dynamic problems. An auto-mesh generation program was used to facilitate the preparation of the input data required by the 8-node quadrilateral elements of mixed and displacement type. Numerical examples were solved using both the mixed beam and plate elements to predict structures' natural frequencies and dynamic responses to a variety of forcing functions. The solutions were compared with the available analytical and displacement-model solutions. The mixed elements developed were found to have significant advantages over conventional displacement elements in the solution of plate-type problems: a dramatic saving in computational time is possible without any loss in solution accuracy. With beam-type problems, there appear to be no significant advantages in using mixed models.
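The free-vibration problem the thesis addresses reduces, after finite element discretization, to the generalized eigenproblem K·u = ω²·M·u. The sketch below (my own, using a conventional displacement-based Euler-Bernoulli beam element rather than the thesis's mixed Reissner elements, with unit material properties chosen for illustration) assembles stiffness and consistent-mass matrices for a cantilever and extracts its first natural frequency.

```python
import numpy as np

# Illustrative parameters: unit stiffness EI, mass per length rho*A, span L.
E = I = rho = A = L = 1.0
n_el = 8
le = L / n_el

# Standard 2-node Euler-Bernoulli beam element (DOFs: deflection, rotation).
Ke = (E * I / le**3) * np.array([
    [12,     6*le,    -12,     6*le],
    [6*le,   4*le**2, -6*le,   2*le**2],
    [-12,   -6*le,     12,    -6*le],
    [6*le,   2*le**2, -6*le,   4*le**2]])
Me = (rho * A * le / 420) * np.array([
    [156,    22*le,    54,    -13*le],
    [22*le,  4*le**2,  13*le, -3*le**2],
    [54,     13*le,    156,   -22*le],
    [-13*le, -3*le**2, -22*le, 4*le**2]])

# Assemble the global stiffness and consistent mass matrices.
ndof = 2 * (n_el + 1)
K = np.zeros((ndof, ndof))
M = np.zeros((ndof, ndof))
for e in range(n_el):
    dofs = slice(2 * e, 2 * e + 4)   # (w_i, theta_i, w_j, theta_j)
    K[dofs, dofs] += Ke
    M[dofs, dofs] += Me

# Clamp the left end (w = theta = 0), then solve K u = omega^2 M u.
K, M = K[2:, 2:], M[2:, 2:]
omega2 = np.linalg.eigvals(np.linalg.solve(M, K)).real
omega1 = np.sqrt(np.sort(omega2)[0])
# Analytical first cantilever frequency: (1.8751)^2 * sqrt(EI / (rho*A*L^4))
```

With only 8 elements the computed fundamental frequency agrees with the analytical cantilever value to several decimal places, which is the kind of displacement-model baseline the thesis compares its mixed elements against.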

Relevance:

20.00%

Publisher:

Abstract:

In response to increasing international competitiveness, many manufacturing businesses are rethinking their management strategies and philosophies towards achieving a computer-integrated environment. The explosive growth in Advanced Manufacturing Technology (AMT) has resulted in the formation of functional "islands of automation" such as Computer Aided Design (CAD), Computer Aided Manufacturing (CAM), Computer Aided Process Planning (CAPP) and Manufacturing Resources Planning (MRPII). This has resulted in an environment with focussed areas of excellence but poor overall efficiency, co-ordination and control. The main role of Computer Integrated Manufacturing (CIM) is to integrate these islands of automation into a totally integrated and controlled environment. However, the various perceptions of CIM, although developing, remain focussed on a very narrow integration scope and have consequently resulted in merely linked islands of automation, with little improvement in overall co-ordination and control. The research described in this thesis develops and examines a more holistic view of CIM, based on the integration of various business elements. One particular business element, namely control, is shown to have a multi-faceted and underpinning relationship with the CIM philosophy. This relationship impacts various aspects of CIM system design, including the CIM business analysis and modelling technique, the specification of systems integration requirements, the CIM system architectural form and the degree of business redesign. The research findings show that fundamental changes to CIM system design are required; these are incorporated in a generic CIM design methodology. The effect and influence of this holistic view of CIM on a manufacturing business has been evaluated through various industrial case study applications.
Based on the evidence obtained, it has been concluded that this holistic, control based approach to CIM can provide a greatly improved means of achieving a totally integrated and controlled business environment. This generic CIM methodology will therefore make a significant contribution to the planning, modelling, design and development of future CIM systems.

Relevance:

20.00%

Publisher:

Abstract:

Developmental dyslexia is associated with deficits in the processing of basic auditory stimuli. Yet it is unclear how these sensory impairments might contribute to poor reading skills. This study better characterizes the relationship between phonological decoding skills, the lack of which is generally accepted to comprise the core deficit in reading disabilities, and auditory sensitivity to amplitude modulation (AM) and frequency modulation (FM). Thirty-eight adult subjects, 17 of whom had a history of developmental dyslexia, completed a battery of psychophysical measures of sensitivity to FM and AM at different modulation rates, along with a measure of pseudoword reading accuracy and standardized assessments of literacy and cognitive skills. The subjects with a history of dyslexia were significantly less sensitive than controls to 2-Hz FM and 20-Hz AM only. The absence of a significant group difference for 2-Hz AM shows that the dyslexics do not have a general deficit in detecting all slow modulations. Thresholds for detecting 2-Hz and 240-Hz FM and 20-Hz AM correlated significantly with pseudoword reading accuracy. After accounting for various cognitive skills, however, multiple regression analyses showed that detection thresholds for both 2-Hz FM and 20-Hz AM were significant and independent predictors of pseudoword reading ability in the entire sample. Thresholds for 2-Hz AM and 240-Hz FM did not explain significant additional variance in pseudoword reading skill. It is therefore possible that certain components of auditory processing of modulations are related to phonological decoding skills, whereas others are not.

Relevance:

20.00%

Publisher:

Abstract:

Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we will demonstrate the improved BER performance of doubly differential phase shift keying in a coherent optical packet switching scenario while still retaining the benefits of high frequency offset tolerance. © OSA 2014.
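The frequency-offset tolerance mentioned above comes from placing the data in the second difference of the carrier phase, so a constant offset cancels at the detector. A minimal noiseless baseband sketch of binary doubly differential PSK (illustrative names and parameters, not the paper's coherent optical setup):

```python
import numpy as np

def dd_bpsk_encode(bits):
    # Data rides on the SECOND phase difference: d_k in {0, pi}, so
    # phi_k = 2*phi_{k-1} - phi_{k-2} + d_k, after two reference symbols.
    phi = [0.0, 0.0]
    for b in bits:
        phi.append(2 * phi[-1] - phi[-2] + np.pi * b)
    return np.exp(1j * np.array(phi))

def dd_bpsk_decode(r):
    # z_k = r_k * conj(r_{k-1})^2 * r_{k-2} has phase
    # phi_k - 2*phi_{k-1} + phi_{k-2} = d_k; a constant frequency offset
    # w contributes w*(k - 2(k-1) + (k-2)) = 0, i.e. it cancels exactly.
    z = r[2:] * np.conj(r[1:-1]) ** 2 * r[:-2]
    return (z.real < 0).astype(int)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=100)
tx = dd_bpsk_encode(bits)
rx = tx * np.exp(1j * 0.3 * np.arange(len(tx)))  # large constant frequency offset
recovered = dd_bpsk_decode(rx)
```

The price of this robustness is a noise penalty relative to single-differential detection, which is the BER trade-off the paper addresses.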

Relevance:

20.00%

Publisher:

Abstract:

A wavelength converter for 10.7 Gbit/s DPSK signals is presented which eliminates the need to use additional encoders or path-dependent encoding/decoding. It has a small penalty of 1.8 dB at an OSNR of 24.8 dB.

Relevance:

20.00%

Publisher:

Abstract:

A new generalized sphere decoding algorithm is proposed for underdetermined MIMO systems with fewer receive antennas (N) than transmit antennas (M). The proposed algorithm is significantly faster than the existing generalized sphere decoding algorithms. The basic idea is to partition the transmitted signal vector into two subvectors, x1 and x2, with N - 1 and M - N + 1 elements respectively. After some simple transformations, an outer-layer sphere decoder (SD) can be used to choose a proper x2, and an inner-layer SD then decides x1, so that the whole transmitted signal vector is obtained. Simulation results show that Double Layer Sphere Decoding (DLSD) has far less complexity than the existing generalized sphere decoders (GSDs).
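The two-layer idea can be illustrated with a brute-force stand-in (my own sketch, not the paper's algorithm): split the transmitted vector into an (M - N + 1)-element part that the outer layer enumerates and an (N - 1)-element part decided by the inner layer. Here both layers are plain exhaustive searches over BPSK symbols; a real double-layer sphere decoder replaces both loops with radius-pruned tree searches.

```python
import itertools
import numpy as np

def two_layer_ml_decode(H, y):
    # Brute-force stand-in for double-layer sphere decoding over BPSK:
    # the outer loop enumerates the M-N+1 "extra" symbols of the
    # underdetermined system, the inner loop decides the remaining N-1.
    n_rx, n_tx = H.shape
    best, best_metric = None, np.inf
    for x2 in itertools.product([-1, 1], repeat=n_tx - n_rx + 1):  # outer layer
        for x1 in itertools.product([-1, 1], repeat=n_rx - 1):     # inner layer
            x = np.array(x1 + x2, dtype=float)
            metric = np.linalg.norm(y - H @ x) ** 2
            if metric < best_metric:
                best, best_metric = x, metric
    return best

# 2 receive antennas, 3 transmit antennas: an underdetermined system
# (illustrative channel matrix; noiseless reception for clarity).
H = np.array([[1.0, 0.3, -0.5],
              [0.2, 1.0, 0.7]])
x_true = np.array([1.0, -1.0, 1.0])
x_hat = two_layer_ml_decode(H, H @ x_true)
```

The point of the partition is that only the extra M - N + 1 dimensions need exhaustive treatment; the rest of the search stays in the efficient sphere-decoding regime.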

Relevance:

20.00%

Publisher:

Abstract:

We determine the critical noise level for decoding low-density parity check error-correcting codes based on the magnetization enumerator (M), rather than on the weight enumerator (W) employed in the information theory literature. The interpretation of our method is appealingly simple, and the relation between the different decoding schemes such as typical pairs decoding, MAP, and finite temperature decoding (MPM) becomes clear. In addition, our analysis provides an explanation for the difference in performance between MN and Gallager codes. Our results are more optimistic than those derived using the methods of information theory and are in excellent agreement with recent results from another statistical physics approach.

Relevance:

20.00%

Publisher:

Abstract:

This article investigates metaphors of identity in dating ads and in two types of newspaper writing, 'hard' and 'soft' news articles. It focuses on issues of textualization and processing, and particularly on the role of cotext in decoding metaphors. Taking a pragmatic approach founded in the cooperative principle, it argues that the maxims of quality and relation play related but separable roles in the interpretation of identity metaphors; and that this process is guided and constrained by cotextual selections in the environment of the metaphorical term. The particular kinds of cotextual guidance provided by the writer are seen to vary according to genre-driven issues. These include the purpose and stylistic conventions of the genre in which the metaphor occurs and the circumstances under which the text is composed and read. Differing functional motivations are suggested for the use of identity metaphors in each of the genres considered. © Walter de Gruyter 2007.