28 results for Convolutional codes over finite rings


Relevance: 30.00%

Abstract:

We present a theoretical method for a direct evaluation of the average error exponent in Gallager error-correcting codes using methods of statistical physics. Results for the binary symmetric channel (BSC) are presented for codes of both finite and infinite connectivity.
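For context, the classical benchmark such statistical-physics calculations are compared against is Gallager's random-coding error exponent, which for the BSC can be evaluated directly from the E0 function. A minimal sketch (function names and the grid-search resolution are illustrative, not from the paper):

```python
import math

def E0(rho, p):
    # Gallager's E0 function for a BSC with crossover probability p,
    # uniform input distribution, in bits:
    # E0(rho) = rho - (1 + rho) * log2(p^(1/(1+rho)) + (1-p)^(1/(1+rho)))
    s = 1.0 / (1.0 + rho)
    return rho - (1.0 + rho) * math.log2(p ** s + (1.0 - p) ** s)

def random_coding_exponent(R, p, steps=10000):
    # E_r(R) = max over rho in [0, 1] of E0(rho) - rho * R,
    # here approximated by a simple grid search
    return max(E0(k / steps, p) - (k / steps) * R for k in range(steps + 1))
```

The exponent is positive below capacity (about 0.531 bits for p = 0.1), decreases with rate, and vanishes at and above capacity.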

Relevance: 30.00%

Abstract:

We present a theoretical method for a direct evaluation of the average and reliability error exponents in low-density parity-check error-correcting codes using methods of statistical physics. Results for the binary symmetric channel are presented for codes of both finite and infinite connectivity.

Relevance: 30.00%

Abstract:

There is a growing demand for data transmission over digital networks involving mobile terminals. An important class of data required for transmission to mobile terminals is image information such as street maps, floor plans and identikit images. This sort of transmission is of particular interest to the service industries, such as the police force, fire brigade and medical services. Such services cannot be provided directly over mobile terminals because of the limited capacity of the mobile channels and the transmission errors caused by multipath (Rayleigh) fading.

In this research, the transmission of line diagram images, such as floor plans and street maps, over digital networks involving mobile terminals at transmission rates of 2400 bits/s and 4800 bits/s has been studied. A low bit-rate source encoding technique using geometric codes is found to be suitable for representing line diagram images. In geometric encoding, the amount of data required to represent or store a line diagram image is proportional to the image detail; a simple line diagram image therefore requires only a small amount of data.

To study the effect of transmission errors due to mobile channels on the transmitted images, error sources (error files) representing mobile channels under different conditions have been produced using channel modelling techniques. Satisfactory models of the mobile channel have been obtained when compared with field test measurements. Subjective performance tests have been carried out to evaluate the quality and usefulness of the received line diagram images under various mobile channel conditions, and the effect of mobile transmission errors on the quality of the received images has been determined.

To improve the quality of the received images under various mobile channel conditions, forward error correcting (FEC) codes with interleaving and automatic repeat request (ARQ) schemes have been proposed. The performance of the error control codes has been evaluated under various mobile channel conditions. It has been shown that an FEC code with interleaving can be used effectively to improve the quality of the received images under both normal and severe mobile channel conditions. Under normal channel conditions, similar results have been obtained when using ARQ schemes; however, under severe mobile channel conditions, the FEC code with interleaving shows better performance.
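The benefit of interleaving against Rayleigh-fading burst errors can be illustrated with a simple block interleaver: bits are written into a matrix row by row and read out column by column, so a burst of consecutive channel errors is dispersed across many codewords, each of which the FEC code can then correct on its own. A minimal sketch with hypothetical dimensions (not the scheme's actual parameters):

```python
def interleave(bits, depth, width):
    # write depth x width matrix row by row, read it out column by column
    assert len(bits) == depth * width
    rows = [bits[r * width:(r + 1) * width] for r in range(depth)]
    return [rows[r][c] for c in range(width) for r in range(depth)]

def deinterleave(bits, depth, width):
    # inverse operation: write column by column, read row by row
    assert len(bits) == depth * width
    cols = [bits[c * depth:(c + 1) * depth] for c in range(width)]
    return [cols[c][r] for r in range(depth) for c in range(width)]
```

A burst of `depth` consecutive channel errors in the interleaved stream lands as exactly one error per row after deinterleaving, which a single-error-correcting code per row can repair.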

Relevance: 30.00%

Abstract:

In this thesis we use statistical physics techniques to study the typical performance of four families of error-correcting codes based on very sparse linear transformations: Sourlas codes, Gallager codes, MacKay-Neal codes and Kanter-Saad codes. We map the decoding problem onto an Ising spin system with many-spin interactions. We then employ the replica method to calculate averages over the quenched disorder represented by the code constructions, the arbitrary messages and the random noise vectors. We find, as the noise level increases, a phase transition between successful decoding and failure phases. This phase transition coincides with upper bounds derived in the information theory literature in most cases. We connect the practical decoding algorithm known as probability propagation with the task of finding local minima of the related Bethe free-energy. We show that the practical decoding thresholds correspond to noise levels where suboptimal minima of the free-energy emerge. Simulations of practical decoding scenarios using probability propagation agree with theoretical predictions of the replica symmetric theory. The typical performance predicted by the thermodynamic phase transitions is shown to be attainable in computation times that grow exponentially with the system size. We use the insights obtained to design a method to calculate the performance and optimise parameters of the high performance codes proposed by Kanter and Saad.
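The mapping from parity checks to many-spin interactions can be made concrete: with sigma_i = (-1)^{x_i}, a parity check is satisfied exactly when the product of the participating spins is +1, so the number of violated checks plays the role of an Ising energy. A minimal illustration (the three-bit check sets are hypothetical, not a code construction from the thesis):

```python
from itertools import product

def to_spin(x):
    # binary 0/1 -> Ising spin +1/-1 via sigma = (-1)^x
    return 1 - 2 * x

def energy(spins, checks):
    # number of violated parity checks: each check (i, j, k) contributes
    # (1 - sigma_i * sigma_j * sigma_k) / 2, i.e. 0 if satisfied, 1 if violated
    return sum((1 - spins[i] * spins[j] * spins[k]) // 2 for (i, j, k) in checks)

# equivalence: XOR of the bits is 0  <=>  product of the spins is +1
for bits in product([0, 1], repeat=3):
    parity = bits[0] ^ bits[1] ^ bits[2]
    prod = to_spin(bits[0]) * to_spin(bits[1]) * to_spin(bits[2])
    assert (parity == 0) == (prod == 1)
```

Decoding then amounts to finding low-energy spin configurations, which is what makes the statistical-mechanics toolbox (replicas, Bethe free-energy) applicable.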

Relevance: 30.00%

Abstract:

The work described in this thesis deals with the development and application of a finite element program for the analysis of several cracked structures. In order to simplify the organisation of the material presented herein, the thesis has been subdivided into two sections. In the first section, the development of a finite element program for the analysis of two-dimensional problems of plane stress or plane strain is described. The element used in this program is the six-node isoparametric triangular element, which permits accurate modelling of curved boundary surfaces. Various cases of material anisotropy are included in the derivation of the element stiffness properties. A digital computer program is described and examples of its application are presented. In the second section, on fracture problems, several cracked configurations are analysed by embedding into the finite element mesh a sub-region containing the singularities, over which an analytic solution is used. The modifications necessary to augment a standard finite element program, such as that developed in Section I, are discussed, and complete programs for each cracked configuration are presented. Several examples are included to demonstrate the accuracy and flexibility of the technique.

Relevance: 30.00%

Abstract:

The present dissertation is concerned with the determination of the magnetic field distribution in magnetic electron lenses by means of the finite element method. In the differential form of this method a Poisson-type equation is solved by numerical methods over a finite boundary. Previous methods of adapting this procedure to the requirements of digital computers have restricted its use to computers of extremely large core size. It is shown that by reformulating the boundary conditions, a considerable reduction in core store can be achieved for a given accuracy of field distribution. The magnetic field distribution of a lens may also be calculated by the integral form of the finite element method. This eliminates the boundary problems mentioned but introduces other difficulties. After a careful analysis of both methods it has proved possible to combine the advantages of both in a new approach to the problem, which may be called the 'differential-integral' finite element method. The application of this method to the determination of the magnetic field distribution of some new types of magnetic lenses is described. In the course of the work considerable re-programming of standard programs was necessary in order to reduce the core store requirements to a minimum.

Relevance: 30.00%

Abstract:

This thesis is concerned with the experimental and theoretical investigation into the compression bond of column longitudinal reinforcement in the transference of axial load from a reinforced concrete column to a base. Experimental work includes twelve tests with square twisted bars and twenty-four tests with ribbed bars. The effects of bar size, anchorage length in the base, plan area of the base, provision of base tensile reinforcement, links around the column bars in the base, plan area of the column and concrete compressive strength were investigated in the tests. The tests indicated that the strength of the compression anchorage of deformed reinforcing steel in concrete was primarily dependent on the concrete strength and on the resistance to bursting available within the anchorage. It was shown in the tests without concreted columns that, owing to the large containment over the bars in the foundation, failure occurred through the breakdown of bond followed by slip of the column bars along the anchorage length. The experimental work showed that the bar size, the stress in the bar, the anchorage length, the provision of transverse steel and the concrete compressive strength significantly affect the bond stress at failure. The ultimate bond stress decreases as the anchorage length is increased, while it increases with each of the remaining parameters. Tests with concreted columns also indicated that a section of the column contributed to the bond length in the foundation by acting as an extra anchorage length.

The theoretical work is based on the Mindlin equation (3), an analytical method used in conjunction with finite difference calculus. The theory is used to plot the distribution of bond stress in the elastic and the elastic-plastic stages of behaviour. The theory is also used to plot the load-vertical displacement relationship of the column bars in the anchorage length, and to determine the theoretical failure load of the foundation. The theoretical solutions are in good agreement with the experimental results, and the distribution of bond stress is shown to be significantly influenced by the bar stiffness factor K. A comparison of the experimental results with the current codes shows that the bond stresses currently used are low; in particular, CP110 (56) specifies very conservative design bond stresses.

Relevance: 30.00%

Abstract:

The human accommodation system has been extensively examined for over a century, with a particular focus on trying to understand the mechanisms that lead to the loss of accommodative ability with age (presbyopia). The accommodative process, along with the potential causes of presbyopia, is disputed, hindering efforts to develop methods of restoring accommodation in the presbyopic eye. One method that can provide insight into this complex area is Finite Element Analysis (FEA). The effectiveness of FEA in modelling the accommodative process has been illustrated by a number of accommodative FEA models developed to date. However, these previous models have had limitations, principally due to the variation in data on the geometry of the accommodative components, combined with sparse measurements of their material properties. Despite advances in available data, continued oversimplification has occurred in the modelling of the crystalline lens structure and the zonular fibres that surround the lens. A new accommodation model is proposed by the author that aims to eliminate these limitations. A novel representation of the zonular structure was developed, combined with updated lens and capsule modelling methods. The model has been designed to be adaptable so that accommodation systems of a range of ages can be modelled, allowing the age-related changes that occur to be simulated. The new modelling methods were validated by comparing the changes induced within the model to available in vivo data, leading to the definition of three different age models. These were used in an extended sensitivity study on age-related changes, in which individual parameters were altered to investigate their effect on the accommodative process. The material properties were found to have the largest impact on the decline in accommodative ability, particularly compared with changes in ciliary body movement or zonular structure. Novel data on the importance of capsule stiffness and thickness were also established. The new model detailed within this thesis provides further insight into the accommodation mechanism, as well as a foundation for future, more detailed investigations into accommodation, presbyopia and accommodative restoration techniques.

Relevance: 30.00%

Abstract:

Focal points:
- ICD-10 codings and spontaneous yellow card reports for warfarin toxicity were compared retrospectively over a one-year period.
- Eighteen cases of ICD-10 coded warfarin toxicity were identified from a total of 55,811 coded episodes.
- More than three times as many ADRs to warfarin were found by screening ICD-10 codes as were reported spontaneously using the yellow card scheme.
- Valuable information is being lost to regulatory authorities; as recognised reporters to the yellow card scheme, pharmacists are well placed to report these ADRs, enhancing their role in the safe and appropriate prescribing of warfarin.

Relevance: 30.00%

Abstract:

In recent years, learning word vector representations has attracted much interest in Natural Language Processing. Word representations or embeddings learned using unsupervised methods help address the problem with traditional bag-of-words approaches, which fail to capture contextual semantics. In this paper we go beyond vector representations at the word level and propose a novel framework that learns higher-level feature representations of n-grams, phrases and sentences using a deep neural network built from stacked Convolutional Restricted Boltzmann Machines (CRBMs). These representations are shown to map syntactically and semantically related n-grams to nearby locations in the hidden feature space. We additionally incorporate these higher-level features into supervised classifier training for two sentiment analysis tasks: subjectivity classification and sentiment classification. Our results demonstrate the success of the proposed framework, with a 4% improvement in accuracy for subjectivity classification and improved results for sentiment classification over models trained without the higher-level features.

Relevance: 30.00%

Abstract:

Some species of crustose lichens, such as Ochrolechia parella (L.) Massal., exhibit concentric marginal rings, which may offer an alternative technique for measuring growth rates and, potentially, a new lichenometric dating method. To examine this hypothesis, the agreement and correlation between ring widths and directly measured annual radial growth rates (RaGR, mm a-1) were studied in 24 thalli of O. parella in north Wales, UK, using digital photography and image analysis. Variation in ring width was observed at different locations around a thallus, between thalli, and from year to year. The best agreement and correlation between ring width and lichen growth rate was between the mean width of the outer two rings (measured in 2011) and mean RaGR (in 2009/10). The O. parella data suggest that the mean width of the youngest two growth rings, averaged over a sample of thalli, is a predictor of recent growth rates and therefore could be used in lichenometry. Potential applications include a convenient method of comparing lichen growth rates on surfaces in different environmental settings, and an alternative method of constructing lichen growth-rate curves without having to revisit the same lichen thalli over many years. However, care is needed when using growth rings to estimate growth rates: growth ring widths may not be stable; ring widths exhibit spatial and temporal variation; rings may not represent one year's growth in all thalli; and adjacent rings may not always represent successive years' growth.
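The proposed predictor, the mean width of the two outermost (youngest) rings averaged over a sample of thalli, is straightforward to compute. A small sketch with hypothetical measurements (widths in mm, outermost ring first; function names are illustrative):

```python
def outer_two_mean(ring_widths):
    # mean width of the two outermost (youngest) rings of one thallus;
    # ring_widths is ordered outermost first
    assert len(ring_widths) >= 2
    return sum(ring_widths[:2]) / 2.0

def sample_growth_estimate(thalli):
    # average the per-thallus predictor over a sample of thalli,
    # as suggested for lichenometric use
    return sum(outer_two_mean(t) for t in thalli) / len(thalli)
```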

Relevance: 30.00%

Abstract:

As one of the most popular deep learning models, the convolutional neural network (CNN) has achieved huge success in image information extraction. Traditionally a CNN is trained by supervised learning with labeled data and used as a classifier by adding a classification layer at the end. Its capability for extracting image features is largely limited by the difficulty of setting up a large training dataset. In this paper, we propose a new unsupervised learning CNN model, which uses a convolutional sparse auto-encoder (CSAE) algorithm to pre-train the CNN. Instead of using labeled natural images for CNN training, the CSAE algorithm can train the CNN with unlabeled artificial images, which enables easy expansion of the training data and unsupervised learning. The CSAE algorithm is especially designed for extracting complex features from specific objects such as Chinese characters. After the features of artificial images are extracted by the CSAE algorithm, the learned parameters are used to initialize the first CNN convolutional layer, and the CNN model is then fine-tuned on scene image patches with a linear classifier. The new CNN model is applied to Chinese scene text detection and is evaluated on a multilingual image dataset, which labels Chinese, English and numeral text separately. A detection precision gain of more than 10% is observed over two CNN models.
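The general pre-training idea, learning first-layer filters from unlabeled patches and using them to initialise a CNN, can be sketched without the full CSAE machinery. The sketch below substitutes plain k-means clustering for the CSAE (a deliberate simplification, not the paper's algorithm); all names and parameters are illustrative:

```python
import random

def extract_patches(image, k, n, rng):
    # sample n random k x k patches from a 2D image (list of rows), flattened
    H, W = len(image), len(image[0])
    out = []
    for _ in range(n):
        r, c = rng.randrange(H - k + 1), rng.randrange(W - k + 1)
        out.append([image[r + i][c + j] for i in range(k) for j in range(k)])
    return out

def learn_filters(patches, n_filters, iters, rng):
    # plain k-means over patches, standing in for CSAE pre-training;
    # each centroid would initialise one first-layer conv kernel
    centers = [p[:] for p in rng.sample(patches, n_filters)]
    for _ in range(iters):
        groups = [[] for _ in range(n_filters)]
        for p in patches:
            best = min(range(n_filters),
                       key=lambda f: sum((a - b) ** 2 for a, b in zip(p, centers[f])))
            groups[best].append(p)
        for f, g in enumerate(groups):
            if g:  # keep the old centroid if its cluster emptied
                centers[f] = [sum(vals) / len(g) for vals in zip(*g)]
    return centers
```

In the full pipeline these learned filters would initialise the first convolutional layer before supervised fine-tuning on labeled scene image patches.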

Relevance: 30.00%

Abstract:

A Finite Element Analysis (FEA) model is used to explore the relationship between clogging and hydraulics in Horizontal Subsurface Flow Treatment Wetlands (HSSF TWs) in the United Kingdom (UK). Clogging is assumed to be caused by particle transport, and an existing single collector efficiency model is implemented to describe this behaviour. The flow model was validated against HSSF TW survey results obtained from the literature. The model successfully simulated the influence of overland flow on hydrodynamics, and the interaction between vertical flow through the low-permeability surface layer and the horizontal flow of the saturated water table. The clogging model described the development of clogging within the system but under-predicted the extent of clogging which occurred over 15 years. This is because important clogging mechanisms, such as biomass growth and vegetation establishment, were not considered by the model. The model showed the usefulness of FEA for linking hydraulic and clogging phenomena in HSSF TWs and could be extended to include treatment processes. © 2011 Springer Science+Business Media B.V.
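The particle-transport view of clogging can be illustrated with the first-order filtration form that underlies collector-efficiency models: suspended-solids concentration decays exponentially along the flow path, so deposition, and hence porosity loss, concentrates near the inlet. A minimal sketch (all parameter values and names hypothetical, not the validated model from the paper):

```python
import math

def concentration_profile(c_in, lam, length, n):
    # C(x) = c_in * exp(-lam * x), evaluated at n cell centres along the bed
    # (first-order filtration with removal coefficient lam per unit length)
    dx = length / n
    return [c_in * math.exp(-lam * (i + 0.5) * dx) for i in range(n)]

def porosity_after(poro0, profile, flux, lam, dt, rho_dep):
    # deposited solids volume per unit bed volume over time dt reduces
    # pore space; removal rate per unit volume = lam * flux * C
    return [max(poro0 - lam * flux * c * dt / rho_dep, 0.0) for c in profile]
```

Because the concentration profile is highest at the inlet, the sketch reproduces the qualitative pattern reported for HSSF TWs: porosity loss is greatest near the inlet and tails off downstream.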