48 results for FEC using Reed-Solomon-like codes
at Cochin University of Science
Abstract:
A cryptosystem based on linear codes was developed in 1978 by McEliece. Later, in 1985, Niederreiter and others developed a modified version of the cryptosystem using concepts of linear codes. These systems, however, were rarely used because of their large key sizes. In this study we design a cryptosystem based on algebraic geometric codes with a smaller key size; error detection and correction can be performed efficiently by simple decoding methods using the cryptosystem developed. Approach: Algebraic geometric codes are codes generated using curves. The cryptosystem uses basic concepts of elliptic curve cryptography and the generator matrix. Decrypted information takes the form of a repetition code, which reduces the complexity of the decoding procedure. Error detection and correction can be carried out efficiently by solving a simple system of linear equations, thereby combining security with error detection and correction. Results: The algorithm was implemented in MATLAB and a comparative analysis was performed on various parameters of the system. Attacks are common to all cryptosystems, but by securely choosing the curve, the field and the representation of field elements, the attacks can be overcome and a stable system generated. Conclusion: The algorithm defined here protects the information from an intruder and also from errors in the communication channel through efficient error correction methods.
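The repetition-code structure of the decrypted information described above can be illustrated with a toy sketch. This is not the thesis algorithm: the 3-fold repetition code, the field GF(2) and the single-bit error model below are illustrative assumptions only.

```python
# Toy sketch of error correction in the spirit of code-based cryptosystems.
# Illustrative assumptions: 3-fold repetition code over GF(2), one channel error.

def encode(bits, r=3):
    """Encode each bit with an r-fold repetition code (generator [1 1 1])."""
    return [b for bit in bits for b in [bit] * r]

def add_error(codeword, position):
    """Flip one bit to model a channel error."""
    corrupted = codeword[:]
    corrupted[position] ^= 1
    return corrupted

def decode(received, r=3):
    """Majority-vote decoding corrects up to (r-1)//2 errors per symbol."""
    return [int(sum(received[i:i + r]) > r // 2)
            for i in range(0, len(received), r)]

message = [1, 0, 1, 1]
noisy = add_error(encode(message), position=4)  # corrupt one transmitted bit
assert decode(noisy) == message                 # the error is corrected
```

Majority voting here plays the role that solving a simple linear system plays in the abstract: the redundancy added before transmission is what makes the corruption recoverable.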
Abstract:
Catalysis is an essential technology in manufacturing industries. The investigation is based on supported vanadia catalysts and their sulfated analogues. Vanadia is a transition metal oxide used in oxidation reactions in the chemical industry; on suitable supports it forms highly active and selective catalysts. The work deals with the preparation of vanadia-incorporated tin oxide and zirconia systems by wet impregnation. Physico-chemical characterization was carried out using instrumental techniques such as BET surface area measurement. The surface acidic properties were determined by ammonia TPD studies, perylene adsorption studies and the cumene conversion reaction. The catalytic activities of the prepared systems were tested by the Friedel-Crafts benzylation of arenes and the Beckmann rearrangement of cyclohexanone oxime, reactions that are relatively rarely reported for such systems. A further objective of this work was to test the application of the catalyst systems for the selective oxidation of cyclohexanol to cyclohexanone, and finally to evaluate their catalytic activity for the vapour-phase oxidative dehydrogenation of ethylbenzene, which leads to the formation of the industrially important compound styrene.
Abstract:
Physico-chemical characterization of Dy2O3/V2O5 systems prepared by the wet impregnation method has been carried out using various techniques such as EDX, XRD, FTIR, thermal studies, BET surface area, pore volume and pore size distribution analysis. The amount of vanadia incorporated has been found to influence the surface properties of dysprosia. The spectroscopic results, combined with X-ray analysis, reveal that vanadia species exist predominantly as isolated amorphous vanadyl units along with crystalline dysprosium orthovanadate. Basicity has been studied by adsorption of electron acceptors, and acidity and acid strength distribution by temperature-programmed desorption of ammonia. Cyclohexanol decomposition has been employed as a chemical probe reaction to examine the effect of vanadia on the acid-base properties of Dy2O3. Incorporation of vanadia titrates the Lewis acid and base sites of Dy2O3, while an enhancement of Brønsted acid sites has been noticed. The data have been correlated with the catalytic activity of these oxides towards the vapour-phase methylation of phenol.
Abstract:
In recent years, photonics has emerged as an essential technology in such diverse fields as laser technology, fiber optics, communication, optical signal processing, computing, entertainment and consumer electronics. The availability of semiconductor lasers and low-loss fibers has also revolutionized the field of sensor technology, including telemetry. Fiber optic sensors are sensitive, reliable, lightweight and accurate devices that find applications in a wide range of areas such as biomedicine, aviation, surgery and pollution monitoring, apart from areas in the basic sciences. The present thesis deals with the design, fabrication and characterization of a variety of cost-effective and sensitive fiber optic sensors for the trace detection of certain environmental pollutants in air and water. The sensor design is carried out using techniques such as evanescent waves, microbending and long-period gratings.
Abstract:
Various synthesis routes have been developed in recent years for the preparation of nanoparticles. One of these methods is polymer-induced crystallization. The first objective of the present work was to prepare nano ZnO powder by polymer-induced crystallization in chitosan solution and to characterize the material using techniques such as TEM, SEM, XRD, FTIR, UV spectroscopy, TGA and DSC. The second objective of the study was to prepare composites using nano ZnO, undertaken to explore the potential of nano ZnO as a reinforcement in engineering as well as commodity thermoplastics and so widen their application spectra. We selected three engineering thermoplastics (poly(ethylene terephthalate), polyamide 6 and polycarbonate) and three commodity plastics (polypropylene, high-density polyethylene and polystyrene) for the study. To date, one of the few disadvantages associated with nanoparticle incorporation has concerned toughness and impact performance: modification of polymers can reduce impact strength. The present study therefore also examined whether nano ZnO can act as a modifier for thermoplastics without sacrificing their impact strength.
Abstract:
The thesis focuses on efficient design methods and reconfigurable architectures suitable for higher-performance wireless communication. The work presented in this thesis describes the development of compact, inexpensive and low-power communication devices that are robust, testable and capable of handling multiple communication standards. A new multistandard Decimation Filter Design Toolbox is developed in the MATLAB GUIDE environment. RNS-based dual-mode decimation filters, reconfigurable for the WCDMA/WiMAX and WCDMA/WLANa standards, are designed and implemented; they offer high-speed operation with a smaller area requirement and lower dynamic power dissipation. A novel sigma-delta based direct analog-to-residue converter that reduces the complexity of the RNS conversion circuitry is presented. The performance of an OFDM communication system with a new RRNS-convolutional concatenated coding scheme is analysed, and improved BER performance is obtained under different channel conditions. Easily testable MAC units for the filters, realized using Reed-Muller logic, are also presented.
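The residue number system (RNS) underlying the dual-mode filters above can be sketched briefly. The moduli set (7, 11, 13) below is an illustrative assumption, not that of the thesis, and Python stands in for the MATLAB/hardware implementations it describes.

```python
# Minimal residue number system (RNS) sketch: carry-free channel arithmetic
# plus Chinese Remainder Theorem reconstruction. Moduli are illustrative.

MODULI = (7, 11, 13)  # pairwise coprime; dynamic range M = 7*11*13 = 1001

def to_rns(x):
    """Represent an integer as a tuple of residues, one per modulus."""
    return tuple(x % m for m in MODULI)

def rns_mul(a, b):
    """Channel-wise multiplication: no carries propagate between moduli."""
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def from_rns(residues):
    """Reconstruct the integer with the Chinese Remainder Theorem."""
    M = 1
    for m in MODULI:
        M *= m
    x = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)  # pow(..., -1, m): modular inverse
    return x % M

a, b = 23, 17
assert from_rns(rns_mul(to_rns(a), to_rns(b))) == a * b  # 391 < 1001
```

The appeal for filter hardware is that each residue channel is a small, independent multiplier, which is what yields the speed and power figures claimed above; the costly step is the conversion in and out of RNS that the thesis's analog-to-residue converter targets.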
Abstract:
Latex protein allergy is a serious problem faced by users of natural rubber latex products. It is most severe in health care workers, who constantly use latex products such as examination gloves and surgical gloves. Of the total proteins, only a small fraction is extractable, and only these proteins cause allergic reactions in sensitized people. Enzymatic deproteinisation of latex, and leaching and chlorination of latex products, are the common methods used to reduce the severity of the problem. Enzymatic deproteinisation is a cumbersome process involving high cost and process loss, and the physical properties of such films are poor. Leaching is a lengthy process, and in leached latex products extractable proteins reappear on further storage. Chlorination causes yellowing of latex products and a reduction in tensile properties. In this context, a simpler process for removing extractable proteins from the latex itself was investigated. This thesis reports the application of poly(propylene glycol) (PPG) to displace extractable proteins from natural latex. PPG is added to 60% centrifuged natural latex to the extent of 0.2% m/m; the latex is subsequently diluted to 30% dry rubber content and again concentrated to obtain a low-protein latex. Dilution of concentrated latex and subsequent concentration lead to a total reduction in non-rubber solids in the concentrate, especially proteins, and a reduction in the ionic concentration in the aqueous phase of the latex. It has been reported that proteins in natural rubber/latex affect its behaviour in the vulcanisation process, and that the ionic concentration in the aqueous phase influences the stability, viscosity and flow behaviour of natural latex. Hence, a detailed technological evaluation was carried out on this low-protein latex.
In this study, low-protein latex was compared with single centrifuged latex (the raw material for almost every latex product) and double centrifuged latex (because dilution and a second concentration of latex are accompanied by some protein removal and a reduction in the ionic concentration of the aqueous phase). Studies were conducted on sulphur cure in conventional and EV systems under conditions of post-cure and prevulcanisation of latex, and on radiation cure in the latex stage. The extractable protein content of vulcanised low-protein latex films was observed to be very low. The low-protein latex was somewhat slower curing than single centrifuged latex, but faster than double centrifuged latex. The modulus of low-protein latex films was slightly low; in general, the physical properties of vulcanised low-protein latex films were only slightly lower than those of single centrifuged latex, and the ageing properties were satisfactory. The viscosity and flow behaviour of the low-protein latex were much better than those of double centrifuged latex and almost comparable to single centrifuged latex. As the physical properties and flow behaviour of the low-protein latex were satisfactory, it was used for the preparation of examination gloves, which were then evaluated; the properties conformed to the Indian Standard Specifications. PPG treatment of natural latex is thus a simple process for preparing low-protein latex. The extractable protein content of the films is very low, their physical properties are comparable to ordinary centrifuged latex and better than those of conventionally deproteinised latex films. This latex can be used for the production of examination gloves.
Abstract:
In statistical machine translation from English to Malayalam, an unseen English sentence is translated into its equivalent Malayalam translation using statistical models: a translation model, a language model and a decoder. A parallel English-Malayalam corpus is used in the training phase. Word-to-word alignments have to be set up among the sentence pairs of the source and target languages before subjecting them to training. This paper deals with techniques that can be adopted for improving the alignment model of SMT. Incorporating part-of-speech information into the bilingual corpus eliminated many of the insignificant alignments. Identifying the named entities and cognates present in the sentence pairs also proved advantageous while setting up the alignments, and reducing the unwanted alignments brought better training results. Experiments conducted on a sample corpus generated reasonably good Malayalam translations, and the results were verified with the F-measure, BLEU and WER evaluation metrics.
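Of the three evaluation metrics named above, word error rate (WER) is the simplest to state precisely: word-level edit distance divided by reference length. A minimal sketch follows; the sample sentences are illustrative, not drawn from the thesis corpus.

```python
# Word error rate (WER): Levenshtein distance over words / reference length.

def wer(reference, hypothesis):
    r, h = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between r[:i] and h[:j]
    dp = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        dp[i][0] = i
    for j in range(len(h) + 1):
        dp[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(r)][len(h)] / len(r)

assert wer("the cat sat on the mat", "the cat sat on the mat") == 0.0
assert wer("the cat sat", "the dog sat") == 1 / 3  # one substitution in three words
```

Unlike BLEU, lower WER is better, which is worth keeping in mind when the two metrics are reported side by side as they are here.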
Abstract:
The distribution and accumulation of trace metals in the sediments of the Cochin estuary during the pre-monsoon, monsoon and post-monsoon periods were investigated. Sediment samples from 14 locations were collected and analysed for metal contents (Mg, Cr, Mn, Fe, Co, Ni, Cu, Zn, Cd and Pb), organic carbon, total nitrogen, total sulphur and grain size. The data were processed using statistical tools such as correlation, factor and cluster analysis. The study revealed an enrichment of Cd and Zn in the study area, particularly at station 2, which is confirmed by the enrichment factor, contamination factor and geoaccumulation index. The factor analysis indicated that Cd and Zn may share the same source. Spatial variation was predominant for metals such as Mg, Cr, Fe, Co, Ni, Cu, Zn, Cd and Pb, unlike Mn, which showed a temporal variation. A strong association of trace metals with Fe and Mn hydroxides and oxides is prominent along the Cochin estuary. Anthropogenic inputs of industrial effluents mainly control the trace metal enrichment in the Cochin estuary.
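The geoaccumulation index used above follows Müller's formula Igeo = log2(Cn / (1.5·Bn)), where Cn is the measured sediment concentration and Bn the geochemical background. The background value and sample concentrations below are illustrative assumptions, not the thesis data.

```python
# Geoaccumulation index (Igeo) and Mueller's pollution classes, as a sketch.
import math

def igeo(concentration, background):
    """Igeo = log2(Cn / (1.5 * Bn)); the 1.5 buffers natural background variation."""
    return math.log2(concentration / (1.5 * background))

def igeo_class(value):
    """Mueller scale: class 0 (unpolluted) up to class 6 (extremely polluted)."""
    return min(6, max(0, math.ceil(value)))

# hypothetical Cd values in mg/kg: measured 2.4 against a background of 0.3
cd = igeo(concentration=2.4, background=0.3)
assert igeo_class(cd) == 3  # moderately to strongly polluted
```

The enrichment factor and contamination factor mentioned alongside Igeo are simple concentration ratios (usually normalized to a conservative element such as Fe or Al), so the three indices can disagree; reporting all three, as the study does, is the usual safeguard.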
Abstract:
A number of genes are involved in the regulation of functional processes in marine bivalves. In the pearl oyster, some of these genes have major roles in immune/defence function and in the biomineralization process involved in pearl formation. As secondary filter feeders, pearl oysters are exposed to various kinds of stressors such as bacteria, viruses, pesticides, industrial wastes, toxic metals and petroleum derivatives, making them susceptible to diseases. Environmental changes and ambient stress also affect non-specific immunity, making the organisms vulnerable to infections. These stressors can trigger various cellular responses in the animals as they counteract the ill effects of the stress, including the expression of defence-related genes encoding factors such as antioxidant enzymes and pattern recognition receptor proteins. One strategy to combat these problems is to gain insight into the disease resistance genes and use them for disease control and health management. Similarly, although it is known that pearl formation in molluscs is mediated by specialized proteins that are in turn regulated by the specific genes encoding them, there is a paucity of information on these genes. In view of the above facts, studies on the defence-related and pearl-forming genes of the pearl oyster assume importance from the point of view of both sustainable fishery management and aquaculture. At present, there is a lack of knowledge of the functional genes and their expression in the Indian pearl oyster Pinctada fucata. Hence this work was taken up to identify and characterize the defence-related and pearl-forming genes, and to study their expression through molecular means, in the Indian pearl oyster Pinctada fucata, which is economically important for aquaculture on the southeast coast of India.
The present study successfully carried out the molecular identification, characterization and expression analysis of defence-related antioxidant enzyme genes and pattern recognition protein genes, which play a vital role in the defence against biotic and abiotic stressors. The antioxidant enzyme genes studied were Cu/Zn superoxide dismutase (Cu/Zn SOD), glutathione peroxidase (GPX) and glutathione-S-transferase (GST). Concerted approaches using molecular tools such as the polymerase chain reaction (PCR), rapid amplification of cDNA ends (RACE), molecular cloning and sequencing resulted in the identification and characterization of the full-length sequence (924 bp) of Cu/Zn SOD, the most important antioxidant enzyme gene. A BLAST search in NCBI confirmed the identity of the gene as Cu/Zn SOD. The characteristic amino acid sequences, such as the copper/zinc binding residues, family signature sequences and signal peptides, were identified. Multiple sequence alignment and phylogenetic analysis of the nucleotide and amino acid sequences using bioinformatics tools such as BioEdit and MEGA revealed that the sequences contain regions of diversity as well as homogeneity. A close evolutionary relationship between P. fucata and other aquatic invertebrates was revealed by the phylogenetic tree constructed using the SOD amino acid sequences of P. fucata and other invertebrates as well as vertebrates.
Abstract:
The Internet has become a vital part of day-to-day life, owing to the revolutionary changes it has brought about in various fields. Dependence on the Internet as an information highway and knowledge bank is increasing exponentially, so that going back is beyond imagination. Critical information is also transferred through the Internet. This widespread use of the Internet, coupled with the tremendous growth in e-commerce and m-commerce, has created a vital need for information security. The Internet has also become an active field for crackers and intruders, and the whole development in this area can become null and void if foolproof security of the data is not ensured. It is hence a challenge before the professional community to develop systems that ensure the security of data sent through the Internet. Stream ciphers, hash functions and message authentication codes play vital roles in providing security services such as confidentiality, integrity and authentication of data sent through the Internet. There are several popular and dependable techniques that have been in wide use for quite a long time, but this long-term exposure makes them vulnerable to successful or near-successful attacks. Hence there is a need to develop new algorithms with better security. Studies were therefore conducted on the various types of algorithms used in this area, with a focus on identifying the properties that impart security. Using the insight derived from these studies, new algorithms were designed. Their performance was then studied, followed by the modifications necessary to yield an improved system consisting of a new stream cipher algorithm MAJE4, a new hash code JERIM-320 and a new message authentication code MACJER-320.
Detailed analysis and comparison with existing popular schemes were also carried out to establish the security levels. The Secure Socket Layer (SSL)/Transport Layer Security (TLS) protocol is one of the most widely used security protocols on the Internet. The cryptographic algorithms RC4 and HMAC have been used to achieve security services such as confidentiality and authentication in SSL/TLS, but recent attacks on RC4 and HMAC have raised questions about their reliability. Hence MAJE4 and MACJER-320 have been proposed as substitutes for them. Detailed studies on the performance of these new algorithms were carried out, and it has been observed that they are dependable alternatives.
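The thesis algorithms (MAJE4, JERIM-320, MACJER-320) are not reproduced here; the sketch below only illustrates the role a message authentication code plays in SSL/TLS-style integrity checking, using the standard-library HMAC in place of MACJER-320.

```python
# Generic MAC generate/verify cycle, the service MACJER-320 is proposed to
# provide. HMAC-SHA256 from the standard library stands in for it here.
import hashlib
import hmac

def tag(key: bytes, message: bytes) -> bytes:
    """Compute an authentication tag over the message under a shared key."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, received_tag: bytes) -> bool:
    """Recompute and compare in constant time to resist timing attacks."""
    return hmac.compare_digest(tag(key, message), received_tag)

key = b"shared secret"
msg = b"transfer 100 to account 42"
t = tag(key, msg)
assert verify(key, msg, t)             # authentic message accepted
assert not verify(key, msg + b"0", t)  # tampered message rejected
```

Any substitute MAC, including MACJER-320, must slot into exactly this generate/verify contract, which is why performance and forgery resistance are the two axes on which the thesis compares it with HMAC.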
Abstract:
To ensure the quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal cutting conditions; however, attaining optimum values each time is difficult even for a skilled operator. The non-linear nature of the machining process has compelled engineers to search for more effective optimization methods. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research work reported here is to build robust optimization algorithms by exploiting ideas that nature has to offer and using them to solve real-world optimization problems in manufacturing processes. In this thesis, after an exhaustive literature review, several optimization techniques used in various manufacturing processes were identified. The selection of optimal cutting parameters, such as depth of cut, feed and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirloskar Turn Master 35 lathe. S/N and ANOVA analyses were performed to find the optimum level and the percentage contribution of each parameter; the S/N analysis yielded the optimum machining parameters from the experiments. Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively evaluate new design solutions within the search space in order to reach the true optimum.
A mathematical model for surface roughness was developed using response surface analysis, and the model was validated using published results from the literature. Optimization methodologies, namely simulated annealing (SA), particle swarm optimization (PSO), a conventional genetic algorithm (CGA) and an improved genetic algorithm (IGA), were applied to optimize the machining parameters for dry turning of SS420 material. All of the above algorithms were tested for efficiency, robustness and accuracy, and it was observed how they often outperform conventional optimization methods on difficult real-world problems. The SA, PSO, CGA and IGA codes were developed in MATLAB. For each evolutionary algorithmic method, optimum cutting conditions are provided to achieve a better surface finish. The computational results using SA clearly demonstrated that the proposed solution procedure is capable of solving such complicated problems effectively and efficiently. Particle swarm optimization is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behavior of biological populations; from the results it was observed that PSO provides better results and is also more computationally efficient. Based on the results obtained using CGA and IGA for the optimization of the machining process, the proposed IGA provides better results than the conventional GA. The improved genetic algorithm incorporates a stochastic crossover technique and an artificial initial population scheme to provide a faster search mechanism. Finally, a comparison among these algorithms was made for the specific example of dry turning of SS420 material, arriving at the optimum machining parameters of feed, cutting speed, depth of cut and tool nose radius with minimum surface roughness as the criterion.
To summarize, the research work fills conspicuous gaps between research prototypes and industry requirements by simulating the evolutionary procedures through which nature optimizes its own systems.
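The simulated-annealing runs described above can be sketched in miniature. The thesis codes were in MATLAB and used a response-surface roughness model; the quadratic surrogate, parameter names and cooling schedule below are illustrative assumptions only.

```python
# Minimal simulated annealing over two cutting parameters (feed, speed),
# minimizing a made-up surface-roughness surrogate with optimum at (0.1, 200).
import math
import random

random.seed(1)

def roughness(feed, speed):
    """Hypothetical roughness surrogate, NOT the thesis response-surface model."""
    return (feed - 0.1) ** 2 * 50 + ((speed - 200) / 100) ** 2

def anneal(steps=5000, temp=1.0, cooling=0.999):
    state = (0.3, 120.0)  # initial feed (mm/rev) and speed (m/min), assumed
    cost = roughness(*state)
    best, best_cost = state, cost
    for _ in range(steps):
        feed = state[0] + random.gauss(0, 0.02)   # perturb the current solution
        speed = state[1] + random.gauss(0, 5.0)
        new_cost = roughness(feed, speed)
        # accept improvements always; accept worse moves with Boltzmann probability
        if new_cost < cost or random.random() < math.exp((cost - new_cost) / temp):
            state, cost = (feed, speed), new_cost
            if cost < best_cost:
                best, best_cost = state, cost
        temp *= cooling  # geometric cooling schedule
    return best, best_cost

(feed, speed), cost = anneal()
assert cost < roughness(0.3, 120.0)  # the anneal improved on the start point
```

The occasional acceptance of worse moves is what distinguishes SA from hill climbing and lets it escape the local minima that make machining optimization non-trivial; PSO and the genetic algorithms in the thesis attack the same landscape with population-based search instead.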
Abstract:
Communication is the process of transmitting data across a channel. Whenever data is transmitted across a channel, errors are likely to occur. Coding theory is the branch of science that deals with finding efficient ways to encode and decode data so that likely errors can be detected and corrected. There are many methods of encoding and decoding; one of them is algebraic geometric codes, which can be constructed from curves. Cryptography is the science of transmitting messages securely from a sender to a receiver. The objective is to encrypt the message in such a way that an eavesdropper would not be able to read it. A cryptosystem is a set of algorithms for encryption and decryption. Public key cryptosystems such as RSA and DSS have traditionally been preferred for secure communication through the channel. However, elliptic curve cryptosystems have become a viable alternative, since they provide greater security and use keys of smaller length compared to other existing cryptosystems. Elliptic curve cryptography is based on the group of points on an elliptic curve over a finite field. This thesis deals with algebraic geometric codes and their relation to cryptography using elliptic curves. Here Goppa codes are used, and the curves used are elliptic curves over a finite field. We relate algebraic geometric codes to cryptography by developing a cryptographic algorithm that includes the processes of encryption and decryption of messages, making use of the fundamental properties of elliptic curve cryptography.
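The group of points that elliptic curve cryptography is built on can be sketched directly. The curve y² = x³ + 2x + 3 over the prime field of size 97 and the base point below are illustrative assumptions; real systems use primes of roughly 256 bits.

```python
# Group law on an elliptic curve y^2 = x^3 + A*x + B over a prime field.
# Curve parameters are toy values for illustration only.

P, A, B = 97, 2, 3  # hypothetical small curve

def on_curve(pt):
    if pt is None:  # None represents the point at infinity (group identity)
        return True
    x, y = pt
    return (y * y - x * x * x - A * x - B) % P == 0

def add(p1, p2):
    """Chord-and-tangent addition of two curve points."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    x1, y1 = p1
    x2, y2 = p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None  # a point plus its inverse is the point at infinity
    if p1 == p2:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P  # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P) % P         # chord slope
    x3 = (s * s - x1 - x2) % P
    return (x3, (s * (x1 - x3) - y1) % P)

G = (3, 6)                 # 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97), so G is on the curve
assert on_curve(G)
assert on_curve(add(G, G))  # the group law keeps points on the curve
```

Repeated addition of a base point (scalar multiplication) is the one-way operation that gives elliptic curve systems their security at small key sizes, and the same curve supplies the divisors from which the Goppa codes in the thesis are built.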
Abstract:
In our study we use a kernel-based classification technique, support vector machine regression, for predicting the melting point of drug-like compounds in terms of topological descriptors, topological charge indices, connectivity indices and 2D autocorrelations. The machine learning model was designed, trained and tested using a dataset of 100 compounds, and it was found that an SVMReg model with an RBF kernel could predict the melting point with a mean absolute error of 15.5854 and a root mean squared error of 19.7576.
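Two pieces named above are easy to state exactly: the RBF kernel the regressor is trained on, and the two reported error metrics. The gamma value and the toy prediction/target arrays below are illustrative, not the thesis dataset.

```python
# RBF kernel plus the MAE and RMSE metrics reported for the SVMReg model.
import math

def rbf_kernel(x, y, gamma=0.5):
    """k(x, y) = exp(-gamma * ||x - y||^2): similarity between descriptor vectors."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error; penalizes large misses more than MAE."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

assert rbf_kernel([1.0, 2.0], [1.0, 2.0]) == 1.0  # identical vectors: max similarity

melting_true = [420.0, 390.0, 455.0]  # hypothetical melting points (K)
melting_pred = [410.0, 400.0, 450.0]
assert mae(melting_true, melting_pred) == 25 / 3
assert abs(rmse(melting_true, melting_pred) - math.sqrt(75.0)) < 1e-9
```

RMSE is always at least as large as MAE on the same data, which is consistent with the 19.76 versus 15.59 pairing reported above.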
Abstract:
This thesis deals with some aspects of the physics of the early universe, such as phase transitions, bubble nucleation and primordial density perturbations, which lead to the formation of structures in the universe. Quantum aspects of the gravitational interaction play an essential role in theoretical high-energy physics. The questions of quantum gravity are naturally connected with the early universe and Grand Unification Theories. In spite of numerous efforts, many problems of quantum gravity remain unsolved; under these circumstances, the consideration of different quantum gravity models is an inevitable stage in studying the quantum aspects of the gravitational interaction. The important role of a gravitationally coupled scalar field in the physics of the early universe is discussed in this thesis. The study shows that the scalar-gravitational coupling and the scalar curvature played a crucial role in determining the nature of the phase transitions that took place in the early universe. The key idea in studying the formation of structure in the universe is that of gravitational instability.