955 results for: FEC using Reed-Solomon-like codes


Relevance: 100.00%

Abstract:

SomeCast is a novel paradigm for the reliable multicast of real-time data to a large set of receivers over the Internet. SomeCast is receiver-initiated and thus scales with the number of receivers, with the diverse characteristics of the paths between senders and receivers (e.g., maximum bandwidth and round-trip time), and with the dynamic conditions of those paths (e.g., congestion-induced delays and losses). SomeCast enables receivers to dynamically adjust the rate at which they receive multicast information so as to satisfy real-time QoS constraints (e.g., rate, deadlines, or jitter). This is done by enabling a receiver to join SOME number of concurrent multiCAST sessions, whereby each session delivers a portion of an encoding of the real-time data. By adjusting the number of such sessions dynamically, client-specific QoS constraints can be met independently. The SomeCast paradigm can be thought of as a generalization of the AnyCast (e.g., Dynamic Server Selection) and ManyCast (e.g., Digital Fountain) paradigms, which have been proposed in the literature to address scalability issues in UniCast and MultiCast environments, respectively. In this paper we overview the SomeCast paradigm, describe an instance of a SomeCast protocol, and present simulation results that quantify the significant advantages gained from adopting such a protocol for the reliable multicast of data to a diverse set of receivers subject to real-time QoS constraints.
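
A minimal sketch of the receiver-side adaptation loop this implies, in Python. All names here (REQUIRED_RATE, SESSION_RATE, the join/leave rule) are illustrative assumptions; the paper does not give this pseudocode.

    # Receiver-side session adjustment: each joined session delivers a
    # distinct portion of the encoded stream, so joining more sessions
    # raises the aggregate receive rate and leaving sessions lowers it.
    REQUIRED_RATE = 1_000_000   # bits/s needed to meet the QoS constraint (assumed)
    SESSION_RATE = 256_000      # assumed nominal rate contributed by one session
    MAX_SESSIONS = 8            # number of sessions the encoding is split across

    def adjust_sessions(joined: int, measured_rate: float) -> int:
        """Return the number of sessions the receiver should now be joined to."""
        if measured_rate < REQUIRED_RATE and joined < MAX_SESSIONS:
            return joined + 1   # falling behind: join one more session
        if measured_rate - SESSION_RATE >= REQUIRED_RATE and joined > 1:
            return joined - 1   # comfortably ahead: shed a session
        return joined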

Relevance: 100.00%

Abstract:

An increasing number of applications, such as distributed interactive simulation, live auctions, distributed games, and collaborative systems, require the network to provide a reliable multicast service. This service enables one sender to reliably transmit data to multiple receivers. Reliability is traditionally achieved by having receivers send negative acknowledgments (NACKs) to request from the sender the retransmission of lost (or missing) data packets. However, this Automatic Repeat reQuest (ARQ) approach results in the well-known NACK implosion problem at the sender. Many reliable multicast protocols have recently been proposed to reduce NACK implosion, but the message overhead due to NACK requests remains significant. Another approach, based on Forward Error Correction (FEC), requires the sender to encode additional redundant information so that a receiver can independently recover from losses. However, due to the lack of feedback from receivers, it is impossible for the sender to determine how much redundancy is needed. In this paper, we propose a new reliable multicast protocol, called ARM (Adaptive Reliable Multicast). Our protocol integrates ARQ and FEC techniques. The objectives of ARM are to (1) reduce the message overhead due to NACK requests, (2) reduce the amount of data transmission, and (3) reduce the time it takes for all receivers to receive the data intact (without loss). During data transmission, the sender periodically informs the receivers of the number of packets that are yet to be transmitted. Based on this information, each receiver predicts whether this amount is enough to recover its losses. Only if it is not enough does the receiver request the sender to encode additional redundant packets. Using ns simulations, we show the superiority of our hybrid ARQ-FEC protocol over the well-known Scalable Reliable Multicast (SRM) protocol.
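
The receiver-side test can be sketched as below; the variable names and the loss-rate discount are assumptions, not the paper's exact predictor. The key point is that with an erasure code, any k distinct packets of a k-packet block suffice, so a receiver only needs the announced future packets to outnumber its own losses.

    # Decide whether to NACK for extra redundancy, given the sender's
    # periodic announcement of how many packets are still to be sent.
    def needs_more_redundancy(lost: int, remaining: int,
                              expected_loss_rate: float) -> bool:
        """True if the packets yet to be sent likely cannot cover our losses."""
        expected_arrivals = remaining * (1.0 - expected_loss_rate)
        return expected_arrivals < lost   # True -> request more parity packets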

Relevance: 100.00%

Abstract:

This paper presents an alternative Forward Error Correction scheme, based on Reed-Solomon codes, aimed at protecting the transmission of RTP multimedia streams: the inter-packet symbol approach. This scheme is based on an alternative bit structure that allocates each symbol of the Reed-Solomon code across several RTP media packets. This characteristic makes it possible to better exploit the recovery capability of Reed-Solomon codes against bursty packet losses. The performance of our approach has been studied in terms of encoding/decoding time versus recovery capability, and compared with other schemes proposed in the literature. The theoretical analysis shows that our approach allows the use of a smaller Galois field than other solutions. This smaller field size reduces the required encoding/decoding time while keeping a comparable recovery capability. Finally, experiments have been carried out to assess the performance of our approach against other schemes in a simulated environment, using models for wireless and wireline channels.
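
The core idea can be illustrated with a toy interleaving layout (a sketch of the general principle, not the paper's exact bit structure): each packet carries one symbol from every codeword, so a burst of lost packets erases only a few symbols of any single codeword.

    # Column-wise interleaving: packet i carries symbol i of every codeword.
    N = 7              # codeword length in symbols, e.g. RS(7,5) over GF(2^3)
    NUM_CODEWORDS = 4

    codewords = [[f"c{j}s{i}" for i in range(N)] for j in range(NUM_CODEWORDS)]
    packets = [[codewords[j][i] for j in range(NUM_CODEWORDS)] for i in range(N)]

    # Losing two consecutive packets erases only two symbols per codeword,
    # which any code with n - k >= 2 parity symbols can still recover.
    lost_packets = {2, 3}
    erased_per_codeword = len(lost_packets)
    print(f"each codeword loses {erased_per_codeword} of {N} symbols")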

Relevance: 100.00%

Abstract:

Background: The systematic collection of high-quality mortality data is a prerequisite for designing relevant drowning prevention programmes. This descriptive study aimed to assess the quality (i.e., level of specificity) of cause-of-death reporting using ICD-10 drowning codes across 69 countries.

Methods: World Health Organization (WHO) mortality data were extracted for analysis. The proportions of unintentional drowning deaths coded as unspecified at the 3-character level (ICD-10 code W74), and of those for which the place of occurrence was unspecified at the 4th character (.9), were calculated for each country as indicators of the quality of cause-of-death reporting.

Results: In 32 of the 69 countries studied, the percentage of unintentional drowning deaths coded as unspecified at the 3-character level exceeded 50%, and in 19 countries this percentage exceeded 80%; in contrast, the percentage was lower than 10% in only 10 countries. In 21 of the 56 countries that report 4-character codes, the percentage of unintentional drowning deaths for which the place of occurrence was unspecified at the 4th character exceeded 50%, and in 15 countries it exceeded 90%; in only 14 countries was this percentage lower than 10%.

Conclusion: Despite the introduction of more specific subcategories for drowning in the ICD-10, many countries were found to be failing to report sufficiently specific codes in the drowning mortality data they submit to the WHO.
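
The two indicators reduce to simple proportions per country. A sketch with an invented record layout (the extract format and the country codes here are assumptions for illustration, not the WHO file format):

    from collections import defaultdict

    # (country, ICD-10 code) for unintentional drowning deaths (W65-W74)
    records = [
        ("AA", "W74.9"), ("AA", "W74"), ("AA", "W69.0"),
        ("BB", "W65.1"), ("BB", "W70.2"), ("BB", "W74.9"),
    ]

    totals = defaultdict(int)
    unspecified_code = defaultdict(int)    # W74: drowning, unspecified
    unspecified_place = defaultdict(int)   # 4th character .9: place unspecified

    for country, code in records:
        totals[country] += 1
        if code.startswith("W74"):
            unspecified_code[country] += 1
        if code.endswith(".9"):
            unspecified_place[country] += 1

    for c in sorted(totals):
        print(c, f"W74: {100 * unspecified_code[c] / totals[c]:.0f}%",
                 f"place .9: {100 * unspecified_place[c] / totals[c]:.0f}%")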

Relevance: 100.00%

Abstract:

This paper describes the limitations of using the International Statistical Classification of Diseases and Related Health Problems, Tenth Revision, Australian Modification (ICD-10-AM) to characterise patient harm in hospitals. The limitations were identified during a project that used diagnoses flagged by Victorian coders as hospital-acquired to devise a classification of 144 categories of hospital-acquired diagnoses (the Classification of Hospital Acquired Diagnoses, or CHADx). CHADx is a comprehensive data-monitoring system designed to allow hospitals to monitor their complication rates month to month using a standard method. The major obstacle to developing the classification was the difficulty of identifying a single event from linear sequences of codes, owing to the absence of code linkage. Obstetric and perinatal episodes also presented challenges in distinguishing condition onset, that is, whether conditions were present on admission or arose after formal admission to hospital. Used appropriately, CHADx allows hospitals to identify areas for future patient-safety and quality initiatives. The value of timing information and code linkage should be recognised in the planning stages of any future electronic systems.

Relevance: 100.00%

Abstract:

Part classification and coding is still considered a laborious and time-consuming exercise. In view of the crucial role it plays in developing automated CAPP systems, this article attempts to automate a few elements of this exercise using a shape-analysis model. In this study, a 24-vector directional template is used to represent the feature elements of the parts (candidate and prototype). Various transformation processes, such as deformation, straightening, bypassing, insertion, and deletion, are embedded in the proposed simulated annealing (SA)-like hybrid algorithm to match a candidate part with its prototype. For a candidate part, searching for its matching prototype in the information database is computationally expensive and requires a large search space. The proposed SA-like hybrid algorithm for solving the part classification problem, however, considerably reduces the search space and ensures early convergence of the solution. The application of the proposed approach is illustrated with an example part, and the approach is applied to the classification of 100 candidate parts and their prototypes to demonstrate the effectiveness of the algorithm. (C) 2003 Elsevier Science Ltd. All rights reserved.
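
A compact sketch of an SA-like matching loop in this spirit (the 24-direction encoding comes from the abstract; the single "deformation" move, the cost function, and the cooling schedule are simplifications of the paper's richer transformation set):

    import math, random

    def cost(candidate, prototype):
        """Mismatch count between equal-length lists of direction codes 0-23."""
        return sum(a != b for a, b in zip(candidate, prototype))

    def perturb(shape):
        """One random move: 'deform' a single element to a new direction."""
        s = list(shape)
        s[random.randrange(len(s))] = random.randrange(24)
        return s

    def sa_match(candidate, prototype, T=10.0, cooling=0.95, steps=2000):
        current = list(candidate)
        best = cost(current, prototype)
        for _ in range(steps):
            trial = perturb(current)
            delta = cost(trial, prototype) - cost(current, prototype)
            if delta <= 0 or random.random() < math.exp(-delta / T):
                current = trial                    # accept, possibly uphill
                best = min(best, cost(current, prototype))
            T *= cooling                           # anneal the temperature
        return best                                # 0 means an exact match

    print(sa_match([0, 5, 12, 3], [0, 6, 12, 3]))  # small mismatch -> usually 0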

Relevance: 100.00%

Abstract:

A novel normally-closed microcage has been fabricated and characterized. The device is made from a bimorph structure of highly compressively stressed diamond-like carbon (DLC) and electroplated Ni. The large stress in the DLC causes the bimorph layer to curve once it is released from the substrate. The radius of curvature is in the range of 18-50 μm and can be controlled by varying the DLC and Ni thicknesses. The devices can be driven with a pulsed current at low operating temperature and can be opened by ∼60 μm laterally with a power consumption of only ∼16 mW. © 2004 IEEE.
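
For a rough sense of how the curvature scales with the layer parameters, Stoney's thin-film approximation can be written down (an assumption on my part: it is strictly valid only when the stressed film is much thinner than the other layer, whereas this device has comparable thicknesses and would need the full bimorph expression, so treat it as an order-of-magnitude sketch):

    \frac{1}{R} \approx \frac{6\,\sigma_f\, t_f\,(1-\nu_s)}{E_s\, t_s^{2}}

where sigma_f and t_f are the stress and thickness of the DLC film, and E_s, nu_s, and t_s are the Young's modulus, Poisson ratio, and thickness of the Ni layer. Thicker Ni or thinner DLC gives a larger radius R, consistent with the thickness control reported above.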

Relevance: 100.00%

Abstract:

Error-correcting codes are combinatorial objects designed to enable reliable transmission of digital data over noisy channels. They are ubiquitous in communication, data storage, etc. Error correction allows reconstruction of the original data from the received word. Classical decoding algorithms are constrained to output just one codeword. In the late 1950s, however, researchers proposed a relaxed error-correction model for potentially large error rates, known as list decoding. The research presented in this thesis focuses on reducing the computational effort and enhancing the efficiency of decoding algorithms for several codes, from both an algorithmic and an architectural standpoint. The codes in consideration are linear block codes closely related to Reed-Solomon (RS) codes. A high-speed, low-complexity algorithm and architecture are presented for encoding and decoding RS codes based on evaluation. The implementation results show that the hardware resources and the total execution time are significantly reduced compared to the classical decoder. The evaluation-based encoding and decoding schemes are modified and extended for shortened RS codes, and a software implementation shows a substantial reduction in memory footprint at the expense of latency. Hermitian codes can be seen as concatenated RS codes and are much longer than RS codes over the same alphabet. A fast, novel, and efficient VLSI architecture for Hermitian codes is proposed based on interpolation decoding; the proposed architecture is shown to outperform Kötter's decoder for high-rate codes. The thesis also explores a method of constructing optimal codes by computing the subfield subcodes of Generalized Toric (GT) codes, a natural extension of RS codes to several dimensions. The polynomial generators, or evaluation polynomials, for subfield subcodes of GT codes are identified, from which the dimension and a bound on the minimum distance are computed. The algebraic structure of the polynomials that evaluate into the subfield is used to simplify the list-decoding algorithm for BCH codes. Finally, an efficient and novel approach is proposed for exploiting powerful codes that have complex decoding but a simple encoding scheme (comparable to RS codes) in multihop wireless sensor network (WSN) applications.
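
The evaluation view of RS encoding mentioned above is easy to state: a codeword is the vector of evaluations of the message polynomial at n distinct field points. A self-contained sketch over GF(2^8) follows (a generic textbook construction, not the thesis's optimized architecture):

    # GF(2^8) arithmetic via exp/log tables, primitive polynomial 0x11D.
    PRIM = 0x11D
    EXP, LOG = [0] * 512, [0] * 256
    x = 1
    for i in range(255):
        EXP[i] = x
        LOG[x] = i
        x <<= 1
        if x & 0x100:
            x ^= PRIM
    for i in range(255, 512):
        EXP[i] = EXP[i - 255]      # extended table avoids a mod 255 in gf_mul

    def gf_mul(a, b):
        return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

    def rs_encode_eval(msg, n=255):
        """Non-systematic RS encoding: evaluate the message polynomial
        (coefficients msg[0..k-1]) at the n points alpha^0 .. alpha^(n-1)."""
        out = []
        for i in range(n):
            pt, acc = EXP[i], 0
            for coeff in reversed(msg):   # Horner's rule; addition is XOR
                acc = gf_mul(acc, pt) ^ coeff
            out.append(acc)
        return out

    codeword = rs_encode_eval(list(range(239)))   # one RS(255,239) codeword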

Relevance: 100.00%

Abstract:

In this paper the construction of the Reed-Solomon RS(255,239) codeword is described, and the process of coding and decoding a message is simulated and verified. RS(255,239), or its shortened version RS(224,208), is used as a coding technique in the Low-Power Single Carrier (LPSC) physical layer, as described in the IEEE 802.11ad standard. The encoder takes 239 8-bit information symbols, adds 16 parity symbols, and constructs a 255-byte codeword to be transmitted through the wireless communication channel. The RS(255,239) codeword is defined over the Galois field GF(2^8) and can correct up to 8 symbol errors. The RS(255,239) code construction is fully implemented, and a Simulink test project is constructed for testing and analysis purposes.
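
The parameters follow the usual RS relation: n - k = 16 parity symbols correct up to t = 16/2 = 8 symbol errors. A quick way to reproduce this behaviour in software is the third-party reedsolo package (an assumption for illustration; the paper itself uses its own Simulink model):

    from reedsolo import RSCodec

    rsc = RSCodec(16)                       # nsym = 16 over GF(2^8), n = 255
    msg = bytes(range(200)) + bytes(39)     # 239 information bytes
    cw = rsc.encode(msg)                    # 255-byte codeword
    assert len(cw) == 255

    corrupted = bytearray(cw)
    for pos in (3, 50, 99, 120, 180, 200, 230, 250):   # inject 8 symbol errors
        corrupted[pos] ^= 0xFF
    decoded = rsc.decode(bytes(corrupted))
    # Recent reedsolo versions return (message, codeword, errata_positions).
    recovered = decoded[0] if isinstance(decoded, tuple) else decoded
    assert bytes(recovered) == msg          # all 8 errors corrected

Shortening RS(255,239) to RS(224,208) keeps the same 16 parity symbols: 31 information bytes are fixed to zero at the encoder and simply not transmitted.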

Relevance: 100.00%

Abstract:

Different species of Leishmania cause a variety of medically important diseases whose control and treatment are still health problems. Telomere-binding proteins (TBPs) have potential as targets for anti-parasitic chemotherapy because of their importance for genome stability and cell viability. Here we describe LaTBP1, a protein that has a Myb-like DNA-binding domain, a feature shared by most double-stranded telomeric proteins. Binding assays using full-length and truncated LaTBP1, combined with spectroscopy analysis, were used to map the boundaries of the Myb-like domain near the protein's only tryptophan residue. The Myb-like domain of LaTBP1 contains a conserved hydrophobic cavity implicated in DNA-binding activity. A hypothetical model helped to visualize that it shares structural homology with the domains of other Myb-containing proteins. Competition assays and chromatin immunoprecipitation confirmed the specificity of LaTBP1 for telomeric and GT-rich DNAs, suggesting that LaTBP1 is a new TBP. (C) 2007 Elsevier B.V. All rights reserved.