993 results for Coding theory
Abstract:
This dissertation concerns the intersection of three areas of discrete mathematics: finite geometries, design theory, and coding theory. The central theme is the power of finite geometry designs, which are constructed from the points and t-dimensional subspaces of a projective or affine geometry. We use these designs to construct and analyze combinatorial objects which inherit their best properties from these geometric structures. A central question in the study of finite geometry designs is Hamada’s conjecture, which proposes that finite geometry designs are the unique designs with minimum p-rank among all designs with the same parameters. In this dissertation, we will examine several questions related to Hamada’s conjecture, including the existence of counterexamples. We will also study the applicability of certain decoding methods to known counterexamples. We begin by constructing an infinite family of counterexamples to Hamada’s conjecture. These designs are the first infinite class of counterexamples for the affine case of Hamada’s conjecture. We further demonstrate how these designs, along with the projective polarity designs of Jungnickel and Tonchev, admit majority-logic decoding schemes. The codes obtained from these polarity designs attain error-correcting performance which is, in certain cases, equal to that of the finite geometry designs from which they are derived. This further demonstrates the highly geometric structure maintained by these designs. Finite geometries also help us construct several types of quantum error-correcting codes. We use relatives of finite geometry designs to construct infinite families of q-ary quantum stabilizer codes. We also construct entanglement-assisted quantum error-correcting codes (EAQECCs) which admit a particularly efficient and effective error-correcting scheme, while also providing the first general method for constructing these quantum codes with known parameters and desirable properties. Finite geometry designs are used to give exceptional examples of these codes.
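The majority-logic decoding mentioned above lends itself to a compact illustration. The Python sketch below is a toy example of one-step majority-logic decoding, not the dissertation's polarity-design codes: the error in position j is estimated by a majority vote over parity checks that are orthogonal on j, i.e. checks whose supports pairwise intersect only in position j.

    def majority_logic_decode(word, orthogonal_checks):
        """word: list of 0/1 bits; orthogonal_checks[j]: parity checks on j."""
        decoded = list(word)
        for j, checks in orthogonal_checks.items():
            votes = [sum(word[i] for i in check) % 2 for check in checks]
            if sum(votes) > len(votes) // 2:   # a majority of checks fail,
                decoded[j] ^= 1                # so flip the suspect bit
        return decoded

    # Demo on the [5,1] repetition code, where the checks {j, k} ("bits j
    # and k agree") are orthogonal on position j; one flipped bit is fixed.
    checks = {j: [(j, k) for k in range(5) if k != j] for j in range(5)}
    received = [1, 1, 0, 1, 1]                 # codeword 11111 with one error
    assert majority_logic_decode(received, checks) == [1, 1, 1, 1, 1]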
Abstract:
Quantum Key Distribution is carving out its place among the tools used to secure communications. While a difficult technology, it enjoys benefits that set it apart from the rest, the most prominent being its provable security based on the laws of physics. QKD requires not only mastering signals at the quantum level, but also classical postprocessing to extract a secret key from them. This postprocessing has customarily been studied in terms of efficiency, a figure of merit that offers a biased view of the performance of real devices. Here we argue that throughput is the significant quantity in practical QKD, especially in the case of high-speed devices, where the differences are more marked, and give some examples contrasting the usual postprocessing schemes with new ones from modern coding theory. A good understanding of its implications is very important for the design of modern QKD devices.
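For context, the efficiency/throughput distinction can be made precise with a standard definition from the information-reconciliation literature (not a formula taken from this abstract): the efficiency of the error-correction step is the ratio of the information actually disclosed to the Shannon minimum,

    f_{\mathrm{EC}} \;=\; \frac{\mathrm{leak}_{\mathrm{EC}}}{n\,h(e)},
    \qquad
    h(e) \;=\; -e\log_2 e \;-\; (1-e)\log_2(1-e),

where n is the block length and e the quantum bit error rate. Throughput, by contrast, counts secret-key bits delivered per unit time, and so also exposes the processing speed of the postprocessing implementation, which efficiency alone hides.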
Abstract:
This work deals with the application of error-detecting and error-correcting codes to the design of fault-tolerant computers, proposing several optimal detection and correction strategies for some of their subsystems. First, the need to apply fault-tolerance techniques is justified. Next, forecasts of the evolution of integration technology are made, together with a classification of the faults occurring in integrated circuits. Starting from a compilation and review of coding theory, a theoretical development is carried out whose application makes it possible to force some of these codes to be closed under the elementary operations executed in a computer. Optimal error detection and correction strategies are proposed for the most important subsystems, culminating in the design, construction and testing of a memory unit and a data-processing unit with extensive error detection and correction capabilities.
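The idea of a code that is closed under an elementary machine operation can be illustrated with a classic arithmetic code. The Python sketch below shows an AN code with A = 3 (a standard textbook construction, not necessarily the codes developed in this work): codewords are multiples of A, the sum of two codewords is again a codeword, and any single bit flip changes the value by plus or minus a power of 2, never a multiple of 3, so it is detected.

    A = 3

    def encode(n: int) -> int:
        return A * n

    def check(word: int) -> bool:
        return word % A == 0

    x, y = encode(12), encode(30)
    s = x + y                            # add the *encoded* operands directly
    assert check(s) and s // A == 42     # the sum is again a codeword
    assert not check(s ^ (1 << 4))       # a single flipped bit is caught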
Abstract:
Technological progress in the areas of informatics and human-interface platforms creates a window of opportunity for the neurorehabilitation of patients with motor impairments. The CogWatch project (www.cogwatch.eu) aims to create an intelligent assistance system to improve motor planning and execution in patients with apraxia during their daily activities. Due to brain damage caused by a cerebrovascular incident, these patients suffer from impairments in the ability to use tools and to sequence actions during daily tasks (such as making breakfast). Based on common coding theory (Hommel et al., 2001) and mirror-neuron primate research (Rizzolatti et al., 2001), we aim to explore the use of cues which incorporate aspects of biological motion from healthy adults performing everyday tasks requiring tool use, together with ecological sounds linked to the action goal. We hypothesize that patients with apraxia will benefit from supplementary sensory information relevant to the task, which will reinforce the selection of the appropriate motor plan. Findings from this study will determine the type of sensory guidance used in the CogWatch interface. The rationale for the experimental design is presented and the relevant literature is discussed.
Abstract:
Efficient hardware implementations of arithmetic operations in Galois fields are highly desirable for several applications, such as coding theory, computer algebra and cryptography. Among these operations, multiplication is of special interest because it is considered the most important building block, so high-speed algorithms and hardware architectures for computing multiplication are in strong demand. In this paper, bit-parallel polynomial basis multipliers over the binary field GF(2^m) generated using type II irreducible pentanomials are considered. The multiplier presented here has the lowest time complexity known to date among similar multipliers based on this type of irreducible pentanomial.
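As a point of reference for the arithmetic involved, the following Python sketch models polynomial-basis multiplication in GF(2^m) in software. It is a bit-serial illustration only, not the paper's bit-parallel hardware architecture, and for concreteness it uses the familiar irreducible pentanomial x^8 + x^4 + x^3 + x + 1 (m = 8) rather than a type II pentanomial.

    M = 8
    POLY = 0b1_0001_1011          # x^8 + x^4 + x^3 + x + 1

    def gf_mul(a: int, b: int) -> int:
        """Multiply a and b as polynomials over GF(2), reduced mod POLY."""
        product = 0
        for i in range(M):
            if (b >> i) & 1:
                product ^= a << i                 # carry-less partial product
        for i in range(2 * M - 2, M - 1, -1):     # reduce degrees 2M-2 .. M
            if (product >> i) & 1:
                product ^= POLY << (i - M)
        return product

    assert gf_mul(0x53, 0xCA) == 0x01   # a known inverse pair in this field

A bit-parallel hardware multiplier computes all partial products and the reduction simultaneously as one two-level XOR/AND network; the structure of the pentanomial determines how many XOR levels the reduction costs, which is where the time-complexity gains of the paper come from.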
Abstract:
Vita.
Abstract:
Bibliography: p. 23.
Abstract:
Thesis (M. S.)--University of Illinois at Urbana-Champaign.
Abstract:
Thesis--University of Illinois.
Abstract:
Includes bibliography.
Abstract:
In this paper we present software developed in the area of Coding Theory. Using it, codes with given properties can be classified. Part of this software can also be used for investigations (isomorphisms, automorphism groups) of other discrete structures: combinatorial designs, Hadamard matrices, bipartite graphs, etc.
Abstract:
*Partially supported by NATO.
Abstract:
The problem of efficiently computing the affine vector operations (addition of two vectors and multiplication of a vector by a scalar over GF(q)), as well as the weight of a given vector, is important for many problems in coding theory, cryptography, VLSI technology, etc. In this paper we propose a new way of representing vectors over GF(3) and GF(4) and describe an efficient implementation of these affine operations. Computing the weights of binary vectors is also discussed.
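One natural representation of the kind the abstract alludes to (a hedged sketch, not necessarily the authors' encoding) stores a GF(4) vector as two bit-planes, so the affine operations reduce to a handful of word-wide XORs and the weight to a single population count. Here GF(4) = {0, 1, w, w^2} with w^2 = w + 1, and coordinate i of the vector holds the element hi_i * w + lo_i.

    def gf4_add(u, v):
        """Vector addition: GF(4) has characteristic 2, so both planes XOR."""
        (uh, ul), (vh, vl) = u, v
        return (uh ^ vh, ul ^ vl)

    def gf4_scale_w(u):
        """Multiply every coordinate by w: w*(h*w + l) = (h^l)*w + h."""
        h, l = u
        return (h ^ l, h)

    def gf4_weight(u):
        """Hamming weight: coordinate i is nonzero iff either plane bit is set."""
        h, l = u
        return bin(h | l).count("1")

    # Example: the length-4 vector (w, w^2, 1, 0), coordinate 0 in the low bit.
    u = (0b0011, 0b0110)
    assert gf4_weight(u) == 3
    assert gf4_add(u, u) == (0, 0)               # v + v = 0 in characteristic 2
    assert gf4_scale_w(u) == (0b0101, 0b0011)    # (w^2, 1, w, 0)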
Abstract:
Forward error correction (FEC) plays a vital role in coherent optical systems employing multi-level modulation. However, much of coding theory assumes that additive white Gaussian noise (AWGN) is dominant, whereas coherent optical systems have significant phase noise (PN) in addition to AWGN. This changes the error statistics and impacts FEC performance. In this paper, we propose a novel semianalytical method for dimensioning binary Bose-Chaudhuri-Hocquenghem (BCH) codes for systems with PN. Our method involves extracting statistics from pre-FEC bit error rate (BER) simulations. We use these statistics to parameterize a bivariate binomial model that describes the distribution of bit errors. In this way, we relate pre-FEC statistics to post-FEC BER and BCH codes. Our method is applicable to pre-FEC BERs around 10^-3 and any post-FEC BER. Using numerical simulations, we evaluate the accuracy of our approach for a target post-FEC BER of 10^-5. Codes dimensioned with our bivariate binomial model meet the target to within 0.2 dB of signal-to-noise ratio.
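The univariate step that such dimensioning builds on is easy to state. The Python sketch below computes the standard binomial estimate of post-FEC BER for a t-error-correcting BCH code under i.i.d. errors; the paper's actual contribution, the bivariate binomial refinement that accounts for phase noise, is not reproduced here.

    from math import comb

    def post_fec_ber(n: int, t: int, p: float) -> float:
        """Binomial estimate of post-FEC BER for a t-error-correcting code of
        length n with i.i.d. pre-FEC bit-error probability p. Blocks with more
        than t errors are assumed to keep their i channel errors (decoder
        miscorrection is ignored), a common approximation."""
        ber = 0.0
        for i in range(t + 1, n + 1):
            block_prob = comb(n, i) * p**i * (1 - p)**(n - i)
            ber += (i / n) * block_prob
        return ber

    # e.g. the BCH(1023, 883) code corrects t = 14 errors; at p = 1e-3 the
    # residual BER comes out far below the 1e-5 target discussed above.
    print(post_fec_ber(1023, 14, 1e-3))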
Abstract:
Chinese-English bilingual students were randomly assigned to three reading conditions: in the English-English (E-E) condition (n = 44), a text in English was read twice; in the English-Chinese (E-C) condition (n = 30), the English text was read first and its Chinese translation second; in the Chinese-English (C-E) condition (n = 30), the Chinese text was read first and the English second. An expected explicit memory test on propositions in the format of sentence verification was given, followed by an unexpected implicit memory test on unfamiliar word forms. Analyses of covariance were conducted with explicit and implicit memory scores as the dependent variables, reading condition (bilingual versus monolingual) as the independent variable, and TOEFL reading score as the covariate. The results showed that the bilingual reading groups outperformed the monolingual reading group on explicit memory tested by sentence verification but not on implicit memory tested by forced-choice word identification, implying that bilingual representation facilitates explicit memory of propositional information but not implicit memory of lexical forms. The findings were interpreted as consistent with separate bilingual memory-storage models, and the implications of such models for the study of cognitive structures were discussed in relation to dual coding theory, multiple memory systems, and the linguistic relativity philosophy.