40 results for FEC using Reed-Solomon and Tornado codes

in Aston University Research Archive


Relevance: 100.00%

Abstract:

The relatively high phase noise of coherent optical systems poses unique challenges for forward error correction (FEC). In this letter, we propose a novel semi-analytical method for selecting combinations of interleaver lengths and binary Bose-Chaudhuri-Hocquenghem (BCH) codes that meet a target post-FEC bit error rate (BER). Our method requires only short pre-FEC simulations, based on which we design interleavers and codes analytically. It is applicable to pre-FEC BER ∼ 10⁻³ and to any post-FEC BER. In addition, we show that there is a trade-off between code overhead and interleaver delay. Finally, for a target post-FEC BER of 10⁻⁵, numerical simulations show that interleaver-code combinations selected using our method achieve a post-FEC BER of around twice the target; the target BER is reached with 0.1 dB of extra signal-to-noise ratio.
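
The analytical selection step can be illustrated with the textbook bounded-distance decoding estimate: assuming an ideal interleaver makes bit errors independent, a t-error-correcting BCH code of length n fails whenever more than t bits are in error. A minimal sketch (the residual-BER approximation below is a common rule of thumb, not the letter's exact method):

```python
from math import comb

def block_error_prob(n: int, t: int, p: float) -> float:
    """P(decoder failure): more than t of the n bits are in error."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(t + 1, n + 1))

def post_fec_ber(n: int, t: int, p: float) -> float:
    """Rule-of-thumb residual BER: a failed block with i > t errors
    leaves roughly i erroneous bits out of n."""
    return sum((i / n) * comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(t + 1, n + 1))
```

Scanning (n, t) pairs until `post_fec_ber` falls below the target mirrors the trade-off described above: stronger codes cost overhead, longer interleavers cost delay.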

Relevance: 100.00%

Abstract:

We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time-sharing methods when algebraic codes are used. The statistical-physics-based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC-based time-sharing codes, while the best performance, when received transmissions are optimally decoded, is bounded by the time-sharing limit.
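
For context on the time-sharing baseline the analysis is compared against: with two binary symmetric channels standing in for the degraded broadcast pair (a simplifying assumption; the paper's channel model may differ), time sharing splits channel uses between the two users, tracing the straight line between the single-user capacities:

```python
from math import log2

def h2(p: float) -> float:
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

def time_sharing_pair(lam: float, p_good: float, p_bad: float):
    """Rate pair when a fraction lam of channel uses serve the better
    (less degraded) user and the rest serve the worse user."""
    return lam * bsc_capacity(p_good), (1 - lam) * bsc_capacity(p_bad)
```

Joint code designs are judged by how far their achievable rate pairs sit from this straight time-sharing line.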

Relevance: 100.00%

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance: 100.00%

Abstract:

In this work, we present an adaptive unequal loss protection (ULP) scheme for H.264/AVC video transmission over lossy networks. This scheme combines erasure coding, H.264/AVC error-resilience techniques and importance measures in video coding. The unequal importance of the video packets is identified at the group-of-pictures (GOP) and H.264/AVC data-partitioning levels. The presented method can adaptively assign unequal amounts of forward error correction (FEC) parity across the video packets according to network conditions such as the available bandwidth, packet loss rate and average burst loss length. A near-optimal algorithm is developed to optimize the FEC assignment. The simulation results show that our scheme effectively utilizes network resources such as bandwidth while improving the quality of the video transmission. In addition, the proposed ULP strategy ensures graceful degradation of the received video quality as the packet loss rate increases. © 2010 IEEE.
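
How unequal parity amounts interact with packet importance can be sketched with a greedy allocator (the numbers and the greedy rule are illustrative, not the paper's near-optimal algorithm): assume each priority class is protected by an RS(k+m, k) erasure code and packets are lost independently with rate p.

```python
from math import comb

def recover_prob(k: int, m: int, p: float) -> float:
    """An RS(k+m, k) erasure code recovers iff at most m of the k+m packets are lost."""
    n = k + m
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(m + 1))

def assign_parity(importance, budget, k, p):
    """Greedily give each spare parity packet to the class with the largest
    importance-weighted gain in recovery probability."""
    parity = [0] * len(importance)
    for _ in range(budget):
        gains = [w * (recover_prob(k, m + 1, p) - recover_prob(k, m, p))
                 for w, m in zip(importance, parity)]
        parity[gains.index(max(gains))] += 1
    return parity
```

With importance weights reflecting GOP position and data-partitioning level, more important classes naturally attract more parity as the loss rate rises.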

Relevance: 100.00%

Abstract:

Improving healthcare quality is a growing need in any society. Although various quality improvement projects are routinely deployed by healthcare professionals, they are characterised by a fragmented approach: they are not linked with the strategic intent of the organisation. This study introduces a framework which integrates all quality improvement projects with the strategic intent of the organisation. It first derives the strengths, weaknesses, opportunities and threats (SWOT) matrix of the system with the involvement of the concerned stakeholders (clinical professionals), which helps identify a few projects whose implementation ensures achievement of the desired quality. The projects are then prioritised using the analytic hierarchy process, again with the involvement of the clinical professionals, and implemented in order to improve system performance. The effectiveness of the method is demonstrated using a case study in the intensive care unit of Queen Elizabeth Hospital in Bridgetown, Barbados.
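
The prioritisation step can be sketched with the row geometric mean approximation to the analytic hierarchy process priority vector (the pairwise comparison values below are invented for illustration):

```python
def ahp_weights(M):
    """Approximate the AHP priority vector of a pairwise comparison matrix
    by the row geometric mean method, normalised to sum to 1."""
    n = len(M)
    gmeans = []
    for row in M:
        prod = 1.0
        for v in row:
            prod *= v
        gmeans.append(prod ** (1.0 / n))
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Three hypothetical improvement projects compared pairwise on Saaty's 1-9 scale:
# project A is moderately preferred to B (3) and strongly preferred to C (5).
comparisons = [[1.0,   3.0,   5.0],
               [1/3.0, 1.0,   3.0],
               [1/5.0, 1/3.0, 1.0]]
weights = ahp_weights(comparisons)
```

The resulting weights rank the projects for implementation order; a full AHP exercise would also check the consistency ratio of the stakeholders' judgements.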

Relevance: 100.00%

Abstract:

A case study demonstrates the use of a process-based approach to change regarding the implementation of an information system for road traffic accident reporting in a UK police force. The supporting tools of process mapping and business process simulation are used in the change process and assist in communicating the current process design and people's roles in the overall performance of that design. The simulation model is also used to predict the performance of new designs incorporating the use of information technology. The approach is seen to have a number of advantages in the context of a public sector organisation. These include the ability for personnel to move from a traditional grouping of staff in occupational groups, with relationships defined by reporting requirements, to a view of their role in a process that delivers performance to a customer. By running the simulation through time it is also possible to gauge how changes at an operational level can lead to the meeting of strategic targets over time. The ability of simulation to prove new designs was seen as particularly important in a government agency where past failures of information technology investments had contributed to a more risk-averse approach to their implementation. © 2004 Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

Ten cases of neuronal intermediate filament inclusion disease (NIFID) were studied quantitatively. The α-internexin-positive neurofilament inclusions (NI) were most abundant in the motor cortex and CA sectors of the hippocampus. The densities of the NI and the swollen achromatic neurons (SN) were similar in laminae II/III and V/VI, but glial cell density was greater in V/VI. The density of the NI was positively correlated with that of the SN and the glial cells. Principal components analysis (PCA) suggested that PC1 was associated with variation in neuronal loss in the frontal/temporal lobes and PC2 with neuronal loss in the frontal lobe and NI density in the parahippocampal gyrus. The data suggest: 1) frontal and temporal lobe degeneration in NIFID is associated with the widespread formation of NI and SN, 2) NI and SN affect cortical laminae II/III and V/VI, 3) the NI and SN affect closely related neuronal populations, and 4) variations in neuronal loss and in the density of NI were the most important sources of pathological heterogeneity. © Springer-Verlag 2005.
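
The reported association between NI and SN densities is the kind of result a plain Pearson correlation captures; a minimal sketch on synthetic density values (the numbers are illustrative, not the study's data):

```python
def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Hypothetical NI and SN densities measured across five cortical regions.
ni_density = [2.0, 5.0, 9.0, 4.0, 7.0]
sn_density = [1.5, 4.0, 8.0, 3.5, 6.0]
r = pearson_r(ni_density, sn_density)  # near +1 when the densities co-vary
```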

Relevance: 100.00%

Abstract:

If product cycle time reduction is the mission, and the multifunctional team is the means of achieving the mission, what then is the modus operandi by which this means is to accomplish its mission? This paper asserts that a preferred modus operandi for the multifunctional team is to adopt a process-oriented view of the manufacturing enterprise, and for this it needs the medium of a process map [16]. The substance of this paper is a methodology which enables the creation of such maps. Specific examples of process models drawn from the product development life cycle are presented and described in order to support the methodology's integrity and value. The specific deliverables we have so far obtained are: a methodology for process capture and analysis; a collection of process models spanning the product development cycle; and an engineering handbook which hosts these models and presents a computer-based means of navigating through these processes, in order to allow users a better understanding of the nature of the business, their role in it, and why the job that they do benefits the work of the company. We assert that this kind of thinking is the essence of concurrent engineering implementation, and further that the systemigram process models uniquely stimulate and organise such thinking.

Relevance: 100.00%

Abstract:

Two sets of experiments, categorized as TG–FTIR and Py–GC–FTIR, are employed to investigate the mechanism of hemicellulose pyrolysis and the formation of the main gaseous and bio-oil products. The "sharp mass loss stage" and the corresponding evolution of the volatile products are examined by the TG–FTIR graphs at heating rates of 3–80 K/min. A pyrolysis unit, composed of a fluidized bed reactor, carbon filter, vapour-condensing system and gas storage, is employed to investigate the products of hemicellulose pyrolysis at different temperatures (400–690 °C) at a feeding flow rate of 600 l/h. The effects of temperature on the condensable products are examined thoroughly. Possible routes for the formation of the products are systematically proposed from the primary decomposition of the three types of unit (xylan, O-acetylxylan and 4-O-methylglucuronic acid) and the secondary reactions of the fragments. It is found that the formation of CO is enhanced at elevated temperature, while only a slight change is observed in the yield of CO2, which is the predominant product in the gaseous mixture.

Relevance: 100.00%

Abstract:

The kinetics of the metathesis of 1-hexene using Re2O7/γ-Al2O3 as the catalyst were investigated under a variety of conditions. The experiments were carried out under high-vacuum conditions. The product solutions were characterised by gas-liquid chromatography and mass spectrometry. The initial kinetics of the metathesis of 1-hexene showed that the reaction was first order in the weight of the catalyst and second order in the concentration of 1-hexene. A kinetic scheme was developed which correlated the experimental data with the metallocarbene chain mechanism postulated by Herisson and Chauvin, and the kinetics of the reaction were explained using a model based on Langmuir-Hinshelwood theory. The low conversion of 1-hexene to its products is due to termination reactions, which most likely occur by the decomposition of the metallocyclobutane intermediate to produce a cyclopropane derivative and an inactive centre. The optimum temperature for the metathesis of 1-hexene over Re2O7/γ-Al2O3 is 45 °C; above this temperature, the rate of metathesis decreases rapidly. Co-catalysts alter the active sites for metathesis so that the catalyst is more selective towards the metathesis of 1-hexene. However, the regeneration of metathesis activity is much worse for promoted catalysts than for the unpromoted catalyst. The synthesis and metathesis of 4,4-dimethyl-2-(9-decenyl)-1,3-oxazoline and 4,4-dimethyl-2-(3-pentenyl)-1,3-oxazoline were attempted, and the products were analysed by thin-layer chromatography, infrared spectroscopy, 13C and 1H NMR, and mass spectrometry. Obtaining the oxazolines in good yield with high purity proved difficult, and consequently metathesis of the impure products did not occur.
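
The reported reaction orders correspond to an initial rate law of the form r = k·W·[hexene]², which can be written down directly (the rate constant, catalyst weight and concentration values below are illustrative, not fitted values from the thesis):

```python
def metathesis_rate(k: float, w_cat: float, conc: float) -> float:
    """Initial metathesis rate: first order in catalyst weight,
    second order in 1-hexene concentration."""
    return k * w_cat * conc ** 2
```

The orders imply that doubling the 1-hexene concentration quadruples the initial rate, while doubling the catalyst weight only doubles it.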

Relevance: 100.00%

Abstract:

This paper presents some forecasting techniques for energy demand and price prediction, one day ahead. These techniques combine the wavelet transform (WT) with fixed and adaptive machine learning/time series models (multi-layer perceptron (MLP), radial basis functions, linear regression, or GARCH). To create an adaptive model, we use an extended Kalman filter or particle filter to update the parameters continuously on the test set. The adaptive GARCH model is a new contribution, broadening the applicability of GARCH methods. We empirically compared two approaches to combining the WT with prediction models: multicomponent forecasts and direct forecasts. These techniques are applied to large sets of real data (both stationary and non-stationary) from the UK energy markets, so as to provide comparative results that are statistically stronger than those previously reported. The results showed that forecasting accuracy is significantly improved by using the WT and adaptive models. The best models for the electricity demand and gas price forecasts are the adaptive MLP and adaptive GARCH, respectively, with the multicomponent forecast; their MSEs are 0.02314 and 0.15384.
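
The multicomponent idea — transform the series, forecast each component separately, then reconstruct — can be sketched with a one-level Haar transform and a naive per-component forecast (the paper uses MLP/RBF/linear/GARCH forecasters; the naive last-value forecast here is only a stand-in):

```python
def haar_dwt(x):
    """One-level Haar transform: pairwise averages (approximation)
    and pairwise half-differences (detail)."""
    a = [(x[2*i] + x[2*i + 1]) / 2 for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i + 1]) / 2 for i in range(len(x) // 2)]
    return a, d

def haar_idwt(a, d):
    """Inverse of haar_dwt: perfect reconstruction for even-length input."""
    out = []
    for ai, di in zip(a, d):
        out += [ai + di, ai - di]
    return out

def multicomponent_forecast(x):
    """Forecast the next two samples by extending each wavelet component
    with its last value, then reconstructing."""
    a, d = haar_dwt(x)
    return haar_idwt(a + [a[-1]], d + [d[-1]])[-2:]
```

In the direct-forecast approach, by contrast, a single model is trained on the reconstructed (or raw) series rather than one model per component.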

Relevance: 100.00%

Abstract:

A method of discriminating between temperature and strain effects in fibre sensing using a conventionally written in-fibre Bragg grating is presented. The technique uses wavelength information from the first and second diffraction orders of the grating element to determine the wavelength-dependent strain and temperature coefficients, from which independent temperature and strain measurements can be made. The authors present results that validate this matrix inversion technique and quantify the strain and temperature errors which can arise for a given uncertainty in the measurement of the reflected wavelength.
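
The matrix inversion at the heart of the technique is a 2×2 solve: the two measured wavelength shifts (first and second order) are related to strain and temperature through a sensitivity matrix, which is inverted to recover both quantities. A sketch with invented coefficients (real values come from calibrating the grating):

```python
def strain_and_temperature(dl1, dl2, k_e1, k_t1, k_e2, k_t2):
    """Solve [dl1, dl2] = K [strain, dT] for the 2x2 sensitivity matrix
    K = [[k_e1, k_t1], [k_e2, k_t2]] by direct inversion."""
    det = k_e1 * k_t2 - k_t1 * k_e2
    if det == 0:
        raise ValueError("singular sensitivity matrix: the two orders "
                         "must have distinct strain/temperature responses")
    strain = (k_t2 * dl1 - k_t1 * dl2) / det
    d_temp = (k_e1 * dl2 - k_e2 * dl1) / det
    return strain, d_temp
```

The quantified errors in the abstract follow from this same inversion: a wavelength-measurement uncertainty propagates through K⁻¹, growing as the matrix approaches singularity.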

Relevance: 100.00%

Abstract:

Tonal, textural and contextual properties are used in the manual photointerpretation of remotely sensed data. This study used these three attributes to produce a lithological map of semi-arid northwest Argentina by semi-automatic computer classification of remotely sensed data. Three types of satellite data were investigated: LANDSAT MSS, TM and SIR-A imagery. Supervised classification using tonal features only produced poor results: LANDSAT MSS gave classification accuracies in the range of 40 to 60%, while accuracies of 50 to 70% were achieved with LANDSAT TM data. The addition of SIR-A data increased the classification accuracy. The higher accuracy of TM over MSS results from the better discrimination of geological materials afforded by the middle-infrared bands of the TM sensor. The maximum likelihood classifier consistently produced classification accuracies 10 to 15% higher than either the minimum-distance-to-means or the decision tree classifier; this improved accuracy was obtained at the cost of greatly increased processing time. A new type of classifier, the spectral shape classifier, which is computationally as fast as a minimum-distance-to-means classifier, is described. However, the results for this classifier were disappointing, being lower in most cases than those of the minimum-distance or decision tree procedures. The classification results using only tonal features were felt to be unacceptably poor, so textural attributes were investigated. Texture is an important attribute used by photogeologists to discriminate lithology. For TM data, texture measures were found to increase the classification accuracy by up to 15%; for LANDSAT MSS data, however, texture measures provided no significant increase in accuracy.
For TM data, second-order texture, especially the SGLDM-based measures, produced the highest classification accuracy. Contextual post-processing was found to increase classification accuracy and improve the visual appearance of the classified output by removing isolated misclassified pixels, which tend to clutter classified images. Simple contextual features such as mode filters were found to outperform more complex features such as gravitational filters or minimal-area replacement methods. Generally, the larger the filter, the greater the increase in accuracy. Production rules were used to build a knowledge-based system which used tonal and textural features to identify sedimentary lithologies in each of the two test sites. The knowledge-based system was able to identify six out of ten lithologies correctly.
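
Two of the ingredients above are easy to sketch: the minimum-distance-to-means classifier and the mode filter used for contextual post-processing (band values, class names and labels below are invented for illustration):

```python
from collections import Counter

def min_distance_classify(pixel, class_means):
    """Assign a pixel to the class whose mean vector is nearest
    in Euclidean distance."""
    def dist2(mean):
        return sum((p - m) ** 2 for p, m in zip(pixel, mean))
    return min(class_means, key=lambda c: dist2(class_means[c]))

def mode_filter(labels, size=3):
    """Replace each label with the most common label in its size x size
    neighbourhood, removing isolated misclassified pixels."""
    h, w, r = len(labels), len(labels[0]), size // 2
    out = [row[:] for row in labels]
    for i in range(h):
        for j in range(w):
            window = [labels[y][x]
                      for y in range(max(0, i - r), min(h, i + r + 1))
                      for x in range(max(0, j - r), min(w, j + r + 1))]
            out[i][j] = Counter(window).most_common(1)[0][0]
    return out
```

The speed appeal of the minimum-distance classifier is clear from the code: one distance per class, with no covariance estimation or likelihood evaluation as in the maximum likelihood classifier.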