11 results for Colour Code
at Cochin University of Science
Abstract:
This thesis presents the results of research on the inherent powers of the High Court in criminal jurisdiction. The criminal justice system in India recognizes inherent powers only of the High Court. As far as the theory and philosophy of inherent powers are concerned, the distinction between civil and criminal laws is of very little consequence. In formulating the research programme, the confusion created by the concept of inherent powers and its application by the High Court forms the central point. How fully the concept is understood, how correctly the power is used, how far it has enhanced the rationale of the administration of criminal justice, what its importance is, and what would allow the inherent power to earn a permanent status in the province of criminal jurisprudence are the themes of this study. The precipitation of new dimensions is the yardstick by which to acknowledge the inherent powers of the High Court and the Supreme Court, and it is of immediate value in the criminal justice system. This study concludes that the innovativeness afforded by the inherent powers has helped the administration of justice draw inspiration from the Constitution. A jurisprudence of inherent powers has developed with the wielding of those powers by the Supreme Court and the High Court. This research work emphasizes unravelling the mystery of jurisprudence caused by the operation of the concept of inherent powers. Its significance is all the more relevant when the power is exercised in the administration of criminal justice. Application or non-application of inherent powers in a given case tells upon the maturity and perfection of the standard of justice.
Abstract:
A cryptosystem using linear codes was developed in 1978 by McEliece. Later, in 1985, Niederreiter and others developed a modified version of the cryptosystem using concepts of linear codes. These systems were not used frequently because of their large key sizes. In this study we design a cryptosystem using the concepts of algebraic geometric codes with a smaller key size. Error detection and correction can be done efficiently by simple decoding methods using the cryptosystem developed. Approach: Algebraic geometric codes are codes generated using curves. The cryptosystem uses basic concepts of elliptic curve cryptography and a generator matrix. Decrypted information takes the form of a repetition code, which reduces the complexity of the decoding procedure. Error detection and correction can be carried out efficiently by solving a simple system of linear equations, thereby providing security along with error detection and correction. Results: The algorithm is implemented in MATLAB and a comparative analysis is carried out on various parameters of the system. Attacks are common to all cryptosystems, but by securely choosing the curve, the field and the representation of field elements, the attacks can be overcome and a stable system generated. Conclusion: The algorithm defined here protects the information from an intruder and also from errors in the communication channel by efficient error correction methods.
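The abstract notes that decrypted information takes the form of a repetition code, whose decoding is simple. As a minimal sketch of that final decoding step only (the thesis's actual algebraic-geometric construction is not reproduced here), a rate-1/r binary repetition code can be decoded by majority vote:

```python
def encode_repetition(bits, r=3):
    # Repeat each information bit r times (a rate-1/r repetition code).
    return [b for b in bits for _ in range(r)]

def decode_repetition(codeword, r=3):
    # Majority vote over each block of r symbols corrects
    # up to floor((r-1)/2) channel errors per block.
    decoded = []
    for i in range(0, len(codeword), r):
        block = codeword[i:i + r]
        decoded.append(1 if sum(block) > r // 2 else 0)
    return decoded

msg = [1, 0, 1, 1]
cw = encode_repetition(msg)          # 12 transmitted symbols
cw[1] ^= 1                           # flip one bit in the channel
assert decode_repetition(cw) == msg  # single error per block corrected
```

Majority decoding is one concrete instance of the "simple system of linear equations" view: each block imposes r copies of the same unknown bit, and the most consistent solution wins.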
Abstract:
The present research problem is to study the existing encryption methods and to develop a new technique that is superior in performance to existing techniques and can, at the same time, be incorporated into the communication channels of fault-tolerant hard real-time systems along with existing error checking / error correcting codes, so that attempts at eavesdropping can be defeated. There are many encryption methods available now, each with its own merits and demerits. Similarly, many cryptanalysis techniques used by adversaries are also available.
Abstract:
Embedded systems are usually designed for a single or a specified set of tasks. This specificity means the system design as well as its hardware/software development can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly augment software quality and is still a challenging field.
This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making early detection of otherwise hard-to-detect software bugs more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum data allocation to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software.
A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation and contributes to improved model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards the correct use of difficult microcontroller features in developing embedded systems.
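The core of redundant bank-switch detection is tracking the active-bank state along an execution path and flagging any bank-select instruction that re-selects the already-active bank. The following is a minimal sketch under a simplified, hypothetical instruction model (tuples of index, opcode and target bank; the thesis's actual relation-matrix formulation over PIC16F87X STATUS-bit instructions is not reproduced):

```python
def find_redundant_bank_switches(instructions):
    """Flag bank-select instructions that re-select the active bank.

    `instructions` is a list of (index, op, bank) tuples, where op is
    'SELECT' for a bank-switch instruction; other ops leave the bank
    state unchanged. This models a single straight-line execution path.
    """
    active = None          # bank state unknown at program entry
    redundant = []
    for idx, op, bank in instructions:
        if op == 'SELECT':
            if bank == active:
                redundant.append(idx)   # no state change: removable
            active = bank
    return redundant

prog = [(0, 'SELECT', 1), (1, 'MOVWF', None),
        (2, 'SELECT', 1),               # bank 1 already active
        (3, 'SELECT', 0)]
assert find_redundant_bank_switches(prog) == [2]
```

On real control flow graphs the same check would run per path, merging bank states at join points, which is where the state transition diagram described above comes in.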
Abstract:
This thesis is entitled Colour Removal from Dye House Effluents Using Zero Valent Iron and Fenton Oxidation. The findings reported on the kinetic profile during oxidation of dyes with Fenton's reagent are in good agreement with observations of earlier workers on other organic substrates. This work goes a step further: the critical concentration of the dye at which the reaction mechanism undergoes a transition has been identified. The oxidation of Reactive Yellow showed that the initial rates of decolorization increased linearly with an increase in hydrogen peroxide concentration over the range studied. Fenton oxidation of all dyes except Methylene Blue showed that the initial rates increased linearly with an increase in the ferrous sulphate concentration. This increase was observed only up to an optimum concentration, beyond which a further increase resulted in a decrease in the initial rates. Variation of the initial rates with ferrous sulphate concentration gave a linear plot passing through the origin, indicating that the reaction is first order with respect to ferrous sulphate.
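A linear plot of initial rate against concentration passing through the origin corresponds to the first-order form rate = k[FeSO4], so the rate constant is the slope of a least-squares line forced through the origin, k = Σxy / Σx². A minimal sketch with hypothetical illustrative data (not the thesis's measurements):

```python
def rate_constant_through_origin(conc, rates):
    # Least-squares slope for a line constrained through the origin:
    # k = sum(x*y) / sum(x^2), appropriate when rate = k * [FeSO4].
    sxy = sum(x * y for x, y in zip(conc, rates))
    sxx = sum(x * x for x in conc)
    return sxy / sxx

# Hypothetical initial-rate data for illustration only.
conc  = [0.001, 0.002, 0.003, 0.004]   # mol/L
rates = [0.010, 0.020, 0.030, 0.040]   # mol/(L*min)
k = rate_constant_through_origin(conc, rates)
assert abs(k - 10.0) < 1e-6            # slope recovered for ideal data
```

Forcing the fit through the origin encodes the physical expectation that the rate vanishes at zero ferrous sulphate concentration.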
Abstract:
The most common and conventional method for removing turbidity from water is coagulation with alum or iron salts and settling of the precipitate in suitably designed clarifiers, followed by filtration. But the sludge produced is bulky, difficult to dewater and accumulates in the dumping grounds, causing environmental problems. Synthetic polymers such as polyacrylamide and polyethylene oxide have been investigated for their ability to remove turbidity. They overcome many of the disadvantages of conventional methods, but are cost-effective only when rapid flocculation and a reduction in sludge volume are demanded. Considering this situation, it was felt that more easily available and eco-friendly materials must be developed for removing turbidity from water. The results of our studies in this direction are presented in this thesis. The thesis comprises nine chapters, with a common bibliography at the end. Chapter 1 gives an introduction to the nature of turbidity and colour usually present in water. Chapter 2 discusses the nature and availability of the principal material used in these studies, namely chitosan. Chapters 3 to 8, which deal with the actual experimental work, are further subdivided into (a) introduction, (b) materials and methods, (c) results and discussion and (d) conclusions. Chapter 9 summarises the entire work so as to put the results and conclusions into proper perspective.
Abstract:
This thesis is essentially concerned with a study of the recovery of pungency-free colour matter from capsicum spice of Indian origin. A spice oleoresin may be defined as the total soluble extract of the spice in a specific solvent; it embraces all the active components that contribute to aroma, taste and related sensory factors associated with the spice, together with varying amounts of pigments, plant waxes, resins and fixed oils. Whereas oleoresins are, in general, coveted for their flavour qualities, in some cases the pigments present therein play a vital role in food technology. Of these, capsicum oleoresin is the most outstanding, since it contributes both colour and flavour principles.
Abstract:
Satellite remote sensing is being used effectively to monitor the ocean surface and its overlying atmosphere. Technical growth in the field of satellite sensors has made satellite measurement an inevitable part of oceanographic and atmospheric research. Among the ocean-observing sensors, ocean colour sensors make use of the visible band of the electromagnetic spectrum (shorter wavelengths). The use of shorter wavelengths ensures the fine spatial resolution needed to depict the oceanographic and atmospheric characteristics of any region with significant spatio-temporal variability. The region off the southwest coast of India is such an area, showing very significant spatio-temporal oceanographic and atmospheric variability due to the seasonally reversing surface winds and currents. Consequently, the region is enriched with features like upwelling, sinking, eddies and fronts. Among them, upwelling brings nutrient-rich waters from subsurface layers to the surface layers. During this process primary production is enhanced, which is measured by ocean colour sensors as high values of Chl a. The vertical attenuation depth of incident solar radiation (Kd) and the Aerosol Optical Depth (AOD) are two other parameters provided by ocean colour sensors. Kd also undergoes significant seasonal variability due to changes in the Chl a content of the water column. Moreover, Kd is affected by sediment transport in the upper layers, as the region experiences land drainage resulting from copious rainfall. The wide range of variability in wind speed and direction may also influence the aerosol sources and transport, and consequently the AOD. The present doctoral thesis concentrates on the utility of Chl a, Kd and AOD provided by satellite ocean colour sensors to understand the oceanographic and atmospheric variability off the southwest coast of India. The thesis is divided into six chapters with further subdivisions.
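The Kd parameter above describes exponential decay of downwelling irradiance with depth, E(z) = E(0)·exp(-Kd·z), so 1/Kd is the depth at which irradiance falls to 1/e of its surface value. A minimal sketch with hypothetical values (the Kd and irradiance numbers are illustrative, not from the thesis):

```python
import math

def irradiance_at_depth(surface_irradiance, kd, depth_m):
    # Exponential attenuation of downwelling irradiance:
    # E(z) = E(0) * exp(-Kd * z), with Kd in 1/m and z in m.
    return surface_irradiance * math.exp(-kd * depth_m)

kd = 0.12                  # hypothetical diffuse attenuation (1/m)
z_att = 1.0 / kd           # attenuation depth: E drops to E(0)/e
e0 = 1400.0                # hypothetical surface irradiance (W/m^2)
e_att = irradiance_at_depth(e0, kd, z_att)
assert abs(e_att - e0 / math.e) < 1e-9
```

Higher Chl a or suspended sediment raises Kd, shrinking 1/Kd, which is why Kd tracks the seasonal and drainage-driven variability described above.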
Abstract:
The modern telecommunication industry demands higher-capacity networks with high data rates. Orthogonal frequency division multiplexing (OFDM) is a promising technique for high-data-rate wireless communications at reasonable complexity in wireless channels. OFDM has been adopted for many types of wireless systems, such as wireless local area networks (IEEE 802.11a) and digital audio/video broadcasting (DAB/DVB). The proposed research focuses on a concatenated coding scheme that improves the performance of OFDM-based wireless communications. It uses a Redundant Residue Number System (RRNS) code as the outer code and a convolutional code as the inner code. The bit error rate (BER) performance of the proposed system under different channel conditions is investigated, including the effects of additive white Gaussian noise (AWGN), multipath delay spread, peak power clipping and frame start synchronization error. The simulation results show that the proposed RRNS-Convolutional concatenated coding (RCCC) scheme provides significant improvement in system performance by exploiting the inherent properties of RRNS.
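An RRNS code represents a value by its residues modulo a set of pairwise-coprime moduli, with extra (redundant) moduli beyond those needed for the information range; a corrupted residue then reconstructs to a value outside the legitimate range, which signals an error. The moduli below are hypothetical and chosen only to keep the sketch small; the thesis's actual RRNS parameters and its correction procedure are not reproduced here:

```python
from functools import reduce

def crt(residues, moduli):
    # Chinese Remainder Theorem reconstruction over pairwise-coprime
    # moduli. pow(Mi, -1, m) computes a modular inverse (Python 3.8+).
    M = reduce(lambda a, b: a * b, moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)
    return x % M

info_moduli = [3, 5, 7]     # legitimate dynamic range: 0..104
redundant = [11]            # redundant modulus for error detection
moduli = info_moduli + redundant

x = 97
residues = [x % m for m in moduli]      # RRNS codeword (residue digits)
assert crt(residues, moduli) == 97      # error-free: value in range

residues[0] = (residues[0] + 1) % 3     # corrupt one residue digit
corrupted = crt(residues, moduli)
assert corrupted >= 3 * 5 * 7           # out of range => error detected
```

In the concatenated scheme, this range check on the outer RRNS code catches residual errors that escape the inner convolutional decoder.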
Abstract:
Code clones are portions of source code that are similar to other code in the same program. The presence of code clones is considered a bad feature of software, as it makes maintenance difficult. Methods for code clone detection have gained immense significance in the last few years, as they play a significant role in engineering applications such as program code analysis, program understanding, plagiarism detection, error detection, code compaction and many similar tasks. Despite this, several features of code clones, if properly utilized, can make the software development process easier. In this work, we point out such a feature of code clones, which highlights their relevance in test sequence identification. Here, program slicing is used in code clone detection. In addition, a classification of code clones is presented, and the benefit of using program slicing in code clone detection is also discussed.
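One common clone criterion in the classifications referred to above treats fragments as clones when they differ only in identifier names and literal values (so-called Type-2 clones). As a minimal sketch of that idea, and not of this work's slicing-based detector, identifiers and literals can be normalized to placeholders before comparing token streams:

```python
import re

KEYWORDS = {"for", "if", "while", "return", "in", "range", "def", "else"}

def normalize(snippet):
    # Replace identifiers and integer literals with placeholder tokens,
    # so fragments differing only in names/values compare equal
    # (a rough Type-2 clone criterion over a single statement).
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|\S", snippet)
    out = []
    for t in tokens:
        if re.fullmatch(r"\d+", t):
            out.append("NUM")
        elif re.fullmatch(r"[A-Za-z_]\w*", t) and t not in KEYWORDS:
            out.append("ID")
        else:
            out.append(t)
    return out

a = "total = total + price * 2"
b = "acc = acc + cost * 7"
c = "acc = acc - cost * 7"
assert normalize(a) == normalize(b)   # clone: only names/literals differ
assert normalize(a) != normalize(c)   # operator differs: not a clone
```

Slicing-based detection, as used in this work, goes further by comparing fragments on dependence structure rather than surface tokens.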
Abstract:
Optical Character Recognition (OCR) plays an important role in digital image processing and pattern recognition. Even though extensive study has been performed on foreign languages like Chinese and Japanese, work on Indian scripts is still immature. OCR for the Malayalam language is more complex, as Malayalam has the largest number of characters among all Indian languages. The challenge of character recognition is even greater in the handwritten domain, due to the varying writing style of each individual. In this paper we propose a system for recognition of offline handwritten Malayalam vowels. The proposed method uses chain codes and the image centroid to extract features, and a two-layer feed-forward network trained with scaled conjugate gradient for classification.
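The two features named above are straightforward to compute: the Freeman chain code encodes a character's boundary as a sequence of 8-directional moves, and the centroid is the mean coordinate of the foreground pixels. A minimal sketch, assuming image coordinates with y increasing downward and a boundary already extracted as an ordered pixel list (the paper's exact feature-extraction pipeline is not specified here):

```python
# 8-directional Freeman chain codes in image coordinates (y down):
# index = direction code, value = (dx, dy).
DIRS = [(1, 0), (1, -1), (0, -1), (-1, -1),
        (-1, 0), (-1, 1), (0, 1), (1, 1)]

def freeman_chain_code(boundary):
    # Encode an ordered list of boundary pixels (x, y) as direction
    # codes between consecutive 8-connected pixels.
    codes = []
    for (x0, y0), (x1, y1) in zip(boundary, boundary[1:]):
        codes.append(DIRS.index((x1 - x0, y1 - y0)))
    return codes

def centroid(pixels):
    # Image centroid: mean coordinate of the foreground pixels.
    xs = [p[0] for p in pixels]
    ys = [p[1] for p in pixels]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A 2x2 pixel square traversed clockwise, starting and ending at (0, 0).
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
assert freeman_chain_code(square) == [0, 6, 4, 2]
assert centroid(square[:-1]) == (0.5, 0.5)
```

The resulting chain-code histogram and centroid-relative offsets are the kind of fixed-length vectors a feed-forward network can consume.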