14 results for Code-switching at Cochin University of Science and Technology
Abstract:
Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine code analysis tools, running on a host machine, for the validation and optimization of embedded system code, which can help meet all of these goals. Such analysis can significantly improve software quality, yet it remains a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making it more effective at early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, thereby improving both the debugging process and the quality of the code.

Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded as propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application program. An incorrect sequence of machine code patterns is identified using slicing techniques on the control flow graph generated from the machine code.

An algorithm is proposed to assist the compiler in eliminating redundant bank switching code and in deciding the optimum data allocation to banked memory, resulting in the minimum number of bank switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified.

This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of the compiler and assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine code patterns, which drastically reduces state space creation and contributes to improved model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features in developing embedded systems.
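The path-by-path rule checking can be pictured as evaluating predicates over instruction sequences enumerated from the control flow graph. The sketch below is illustrative only: the toy CFG, the instruction format and the single rule (a bank-1 register such as TRISB may only be written while RP0 is set) are assumptions and do not reproduce the rule set or encoding used in the dissertation.

```python
# Minimal sketch of rule-based path checking over a toy control flow graph.
# Instruction syntax, CFG and the single rule are illustrative assumptions.

cfg = {
    # node: (straight-line block of instructions, successor nodes)
    "entry": (["BSF STATUS,RP0"], ["setup", "fault"]),
    "setup": (["MOVWF TRISB", "BCF STATUS,RP0"], ["exit"]),
    "fault": (["BCF STATUS,RP0", "MOVWF TRISB"], ["exit"]),
    "exit":  (["MOVWF PORTB", "RETURN"], []),
}

def paths(node="entry", prefix=()):
    """Enumerate the instruction sequence along every acyclic CFG path."""
    block, successors = cfg[node]
    seq = prefix + tuple(block)
    if not successors:
        yield seq
    for nxt in successors:
        yield from paths(nxt, seq)

def rule_trisb_needs_bank1(seq):
    """Hypothetical rule: the bank-1 register TRISB may only be written
    while the RP0 bank-select bit of STATUS is set."""
    rp0 = False
    for ins in seq:
        if ins == "BSF STATUS,RP0":
            rp0 = True
        elif ins == "BCF STATUS,RP0":
            rp0 = False
        elif "TRISB" in ins and not rp0:
            return False          # rule violated somewhere on this path
    return True

for p in paths():
    print(rule_trisb_needs_bank1(p), p)
```

One path through the toy graph satisfies the rule and the other violates it, which is the kind of per-path verdict the validation tool reports back to the designer.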
Abstract:
This thesis presents the results of research work on the inherent powers of the High Court in criminal jurisdiction. The criminal justice system in India recognizes inherent powers only of the High Court. So far as the theory and philosophy of inherent powers are concerned, the distinction between civil and criminal laws is of very little consequence. In formulating the research programme, the confusion created by the concept of inherent powers and its application by the High Court forms the central point. How fully the concept is understood, how correctly the power is used, how far it has enhanced the rationale of the administration of criminal justice, what its importance is, and what the solutions are for the inherent power to earn a permanent status in the province of criminal jurisprudence are the themes of this study. The precipitation of new dimensions is the yardstick by which to acknowledge the inherent powers of the High Court and the Supreme Court, and it is of immediate value in the criminal justice system. This study concludes that the innovativeness provided by the inherent powers has helped the administration of justice draw inspiration from the Constitution. A jurisprudence of inherent powers has developed with the wielding of inherent powers by the Supreme Court and the High Court. This research work lays emphasis on unravelling the mystery in jurisprudence caused by the operation of the concept of inherent powers. Its significance is all the more relevant when the power is exercised in the administration of criminal justice. Application or non-application of inherent powers in a given case tells upon the maturity and perfection of the standard of justice.
Abstract:
Photonic band-gap (PBG) structures are utilized in microwave components as filters to suppress unwanted signals. In this work, rectangular perforations were created in the ground plane of a microstrip line to construct a PBG structure. A gold-coated alumina substrate was utilized to switch or tune the bandstop characteristics of this structure. It was demonstrated that the bandstop characteristics were switched off from -35 to -1 dB at 16 GHz. Tuning of the bandstop edge with a shift of 1.5 GHz was also shown.
Abstract:
A cryptosystem using linear codes was developed in 1978 by McEliece. Later, in 1985, Niederreiter and others developed a modified version of the cryptosystem using concepts of linear codes. These systems have not been used widely because of their large key size. In this study we design a cryptosystem using the concepts of algebraic geometric codes with a smaller key size. Error detection and correction can be done efficiently by simple decoding methods using the cryptosystem developed. Approach: Algebraic geometric codes are codes generated using curves. The cryptosystem uses basic concepts of elliptic curve cryptography and a generator matrix. Decrypted information takes the form of a repetition code, due to which the complexity of the decoding procedure is reduced. Error detection and correction can be carried out efficiently by solving a simple system of linear equations, thereby providing security along with error detection and correction. Results: Implementation of the algorithm is done in MATLAB and a comparative analysis is also done on various parameters of the system. Attacks are common to all cryptosystems, but by carefully choosing the curve, the field and the representation of elements in the field, we can overcome these attacks and a stable system can be generated. Conclusion: The algorithm defined here protects the information from an intruder and also from errors in the communication channel through efficient error correction methods.
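As a toy illustration of the McEliece construction referred to above (using a small binary Hamming code rather than the algebraic geometric codes developed in this study), the sketch below forms a public generator matrix from secret scrambling and permutation matrices and encrypts by adding a low-weight error vector; all matrices shown are hypothetical examples.

```python
import numpy as np

# Generator matrix of the [7,4] binary Hamming code (toy stand-in for an AG code).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

# Secret scrambler S (invertible over GF(2)) and permutation matrix P.
S = np.array([[1, 1, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1]])
P = np.eye(7, dtype=int)[[6, 0, 5, 1, 4, 2, 3]]

G_pub = (S @ G @ P) % 2            # the published (disguised) generator matrix

def encrypt(m, G_pub, weight=1):
    """Ciphertext c = m*G_pub + e over GF(2), with a random weight-`weight` error e."""
    e = np.zeros(G_pub.shape[1], dtype=int)
    e[np.random.choice(G_pub.shape[1], weight, replace=False)] = 1
    return (m @ G_pub + e) % 2

m = np.array([1, 0, 1, 1])
print("ciphertext:", encrypt(m, G_pub))
# Decryption undoes P, decodes the underlying code to strip e, then undoes S.
```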
Abstract:
Department of Instrumentation, Cochin University of Science and Technology
Abstract:
The present research problem is to study the existing encryption methods and to develop a new technique which is performance-wise superior to other existing techniques and, at the same time, can be readily incorporated in the communication channels of fault-tolerant hard real-time systems along with existing error checking/error correcting codes, so that attempts at eavesdropping can be defeated. There are many encryption methods available now, and each method has its own merits and demerits. Similarly, many cryptanalysis techniques used by adversaries are also available.
Abstract:
We demonstrate the possibility of realizing all-optical switching in a gold nanosol. Two overlapping laser beams are used for this purpose: a low-power beam passing collinearly with a high-power beam undergoes cross-phase modulation, which distorts its spatial profile. This is turned to advantage for performing logic operations. We have also measured the threshold pump power required to obtain a NOT gate and the minimum response time of the device. Contrary to the general notion that the response time of the thermal effects used in this application is of the order of milliseconds, we prove that short pump pulses can result in fast switching. Different combinations of beam splitters and combiners will lead to the formation of other logic functions too.
Abstract:
This thesis, entitled "Electrical switching studies on thin films of polyfuran and polyacrylonitrile prepared by plasma polymerisation and of vacuum-evaporated amorphous silicon", begins with a general introduction to switching and allied phenomena. Subsequently, developments in thin-film switching are described. The Mott transition is qualitatively presented. The working of a switching transistor is outlined and compared with the switching observed in thin films. Characteristic parameters of switching, such as threshold voltage, time response to a voltage pulse and delay time, are described. The various switching configurations commonly used are discussed. The mechanisms used to explain the switching behaviour, such as thermal, electrothermal and purely electronic, are reviewed. Finally, the scope, feasibility and importance of polymer thin films in switching are highlighted.
Abstract:
This thesis investigates the potential use of zero-crossing information for speech sample estimation. It provides a new method to estimate speech samples using composite zero-crossings. A simple linear interpolation technique is developed for this purpose. By using this method the A/D converter can be avoided in a speech coder. The newly proposed zero-crossing sampling theory is supported with results of computer simulations using real speech data. The thesis also presents two methods for voiced/unvoiced classification. One of these methods is based on a distance measure which is a function of the short-time zero-crossing rate and the short-time energy of the signal. The other is based on the attractor dimension and entropy of the signal. Of these two methods, the first is simple and requires only very few computations compared to the other; it is used in a later chapter to design an enhanced Adaptive Transform Coder. The later part of the thesis addresses a few problems in Adaptive Transform Coding and presents an improved ATC. The transform coefficient with maximum amplitude is treated as side information, which enables more accurate bit assignment and step-size computation. A new bit reassignment scheme is also introduced in this work. Finally, an ATC which switches between the Discrete Cosine Transform and the Discrete Walsh-Hadamard Transform for voiced and unvoiced speech segments respectively is presented. Simulation results are provided to show the improved performance of the coder.
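The first voiced/unvoiced method can be pictured as a nearest-reference decision on two short-time features: the zero-crossing rate and the frame energy. The sketch below is only a simplified illustration under assumed frame sizes and reference points; it does not use the trained distance measure or the speech data of the thesis.

```python
import numpy as np

def frame_features(x, frame_len=240, hop=120):
    """Short-time zero-crossing rate and log energy for each analysis frame."""
    feats = []
    for start in range(0, len(x) - frame_len + 1, hop):
        f = x[start:start + frame_len]
        zcr = np.mean(np.abs(np.diff(np.sign(f))) > 0)           # fraction of sign changes
        energy = np.log10(np.sum(f.astype(float) ** 2) + 1e-12)  # short-time log energy
        feats.append((zcr, energy))
    return np.array(feats)

def classify_voiced(feats, voiced_ref=(0.05, 1.5), unvoiced_ref=(0.4, -0.5)):
    """Label a frame voiced if it lies closer (Euclidean distance) to the
    low-ZCR/high-energy reference point; the reference points are
    illustrative guesses, not values from the thesis."""
    d_v = np.linalg.norm(feats - np.array(voiced_ref), axis=1)
    d_u = np.linalg.norm(feats - np.array(unvoiced_ref), axis=1)
    return d_v < d_u                                             # True -> voiced

# Synthetic check: 1 s of a 200 Hz tone followed by 1 s of noise at 8 kHz.
fs = 8000
t = np.arange(fs) / fs
x = np.concatenate([0.5 * np.sin(2 * np.pi * 200 * t), 0.05 * np.random.randn(fs)])
labels = classify_voiced(frame_features(x))
half = len(labels) // 2
print("voiced fraction (tone half): ", labels[:half].mean())
print("voiced fraction (noise half):", labels[half:].mean())
```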
Abstract:
The modern telecommunication industry demands higher-capacity networks with high data rates. Orthogonal frequency division multiplexing (OFDM) is a promising technique for high-data-rate wireless communications at reasonable complexity in wireless channels. OFDM has been adopted for many types of wireless systems, such as wireless local area networks (IEEE 802.11a) and digital audio/video broadcasting (DAB/DVB). The proposed research focuses on a concatenated coding scheme that improves the performance of OFDM-based wireless communications. It uses a Redundant Residue Number System (RRNS) code as the outer code and a convolutional code as the inner code. The bit error rate (BER) performance of the proposed system under different channel conditions is investigated. These include the effects of additive white Gaussian noise (AWGN), multipath delay spread, peak power clipping and frame start synchronization error. The simulation results show that the proposed RRNS-convolutional concatenated coding (RCCC) scheme provides significant improvement in system performance by exploiting the inherent properties of RRNS.
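The outer RRNS code can be illustrated as follows: an information value is represented by its residues with respect to a set of pairwise coprime moduli, plus residues for extra redundant moduli, and any legitimate subset of residues reconstructs the value via the Chinese Remainder Theorem. The moduli below are small illustrative values, and the convolutional inner code and the OFDM chain of the proposed scheme are not reproduced.

```python
from math import prod

# Small illustrative moduli sets (pairwise coprime); a practical RRNS would
# use moduli chosen to match the target dynamic range and error model.
INFO_MODULI = [7, 11, 13]
REDUNDANT_MODULI = [17, 19]
MODULI = INFO_MODULI + REDUNDANT_MODULI

def rrns_encode(x):
    """Residue representation of x, including the redundant residues."""
    assert 0 <= x < prod(INFO_MODULI)
    return [x % m for m in MODULI]

def crt(residues, moduli):
    """Reconstruct x from a residue subset via the Chinese Remainder Theorem."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)      # modular inverse (Python >= 3.8)
    return x % M

codeword = rrns_encode(600)
print("residues:", codeword)

# Any len(INFO_MODULI) correct residues recover the value; the redundant
# residues provide the consistency checks used for error detection/correction.
print("recovered:", crt(codeword[1:4], MODULI[1:4]))
```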
Abstract:
Code clones are portions of source code that are similar to other portions of the program code. The presence of code clones is considered a bad feature of software, as it makes software maintenance difficult. Methods for code clone detection have gained immense significance in the last few years, as they play a significant role in engineering applications such as analysis of program code, program understanding, plagiarism detection, error detection, code compaction and many similar tasks. Despite these facts, several features of code clones, if properly utilized, can make the software development process easier. In this work, we point out one such feature of code clones which highlights the relevance of code clones in test sequence identification. Here, program slicing is used in code clone detection. In addition, a classification of code clones is presented, and the benefit of using program slicing in code clone detection is also discussed.
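As a simplified illustration of clone detection (normalised line-window hashing rather than the program-slicing approach described here), the sketch below flags fragments that match after identifiers and literals are masked, which corresponds roughly to Type-2 clones; the function and parameter names are hypothetical.

```python
import re

def normalize(line):
    """Crude Type-2 normalisation: mask identifiers and numeric literals."""
    line = re.sub(r"\b[A-Za-z_]\w*\b", "ID", line.strip())
    return re.sub(r"\b\d+\b", "NUM", line)

def find_clones(source_a, source_b, window=3):
    """Return (index_a, index_b) pairs of `window`-line fragments whose
    normalised forms appear in both sources."""
    def windows(text):
        lines = [normalize(l) for l in text.splitlines() if l.strip()]
        return {tuple(lines[i:i + window]): i
                for i in range(len(lines) - window + 1)}
    wa, wb = windows(source_a), windows(source_b)
    return [(wa[w], wb[w]) for w in wa.keys() & wb.keys()]

a = "x = 1\ny = x + 2\nprint(y)\nz = 0\n"
b = "a = 1\nb = a + 2\nprint(b)\n"
print(find_clones(a, b))   # [(0, 0)]: the renamed three-line fragment is reported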
Abstract:
Optical character recognition (OCR) plays an important role in digital image processing and pattern recognition. Even though considerable study has been carried out on foreign languages like Chinese and Japanese, work on Indian scripts is still immature. OCR in the Malayalam language is more complex, as Malayalam is enriched with the largest number of characters among all Indian languages. The challenge of character recognition is even higher in the handwritten domain, due to the varying writing style of each individual. In this paper we propose a system for the recognition of offline handwritten Malayalam vowels. The proposed method uses the chain code and the image centroid for extracting features, and a two-layer feed-forward network with scaled conjugate gradient training for classification.
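The two feature extractors can be sketched concretely: the Freeman 8-direction chain code records the direction between successive boundary pixels, and the image centroid provides a positional reference for normalisation. The example below is an illustrative assumption rather than the system described in the paper; the boundary tracing, array sizes and the downstream two-layer network are not shown.

```python
import numpy as np

# Freeman 8-direction chain code: code index -> (row offset, column offset)
DIRECTIONS = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]

def chain_code(boundary):
    """Chain code of an ordered list of 8-connected boundary pixel coordinates."""
    return [DIRECTIONS.index((r1 - r0, c1 - c0))
            for (r0, c0), (r1, c1) in zip(boundary, boundary[1:])]

def centroid(image):
    """Centroid (row, column) of the foreground pixels of a binary image."""
    rows, cols = np.nonzero(image)
    return rows.mean(), cols.mean()

# Tiny example: the outline of a 3x3 square traced clockwise from the top-left.
boundary = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0), (0, 0)]
img = np.zeros((3, 3), dtype=int)
for r, c in boundary:
    img[r, c] = 1

print("chain code:", chain_code(boundary))   # a direction histogram could feed the classifier
print("centroid:  ", centroid(img))
```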
Abstract:
Bank switching in embedded processors having a partitioned memory architecture results in code size as well as run-time overhead. An algorithm, and its application, to assist the compiler in eliminating the redundant bank switching code introduced and in deciding the optimum data allocation to banked memory is presented in this work. A relation matrix formed for the memory bank state transition corresponding to each bank selection instruction is used for the detection of redundant code. Data allocation to memory is done by considering all possible permutations of memory banks and combinations of data. The compiler output corresponding to each data mapping scheme is subjected to a static machine code analysis which identifies the one with the minimum number of bank switching instructions. Even though the method is compiler independent, the algorithm utilizes certain architectural features of the target processor. A prototype based on PIC16F87X microcontrollers is described. The method scales well to larger numbers of memory blocks and to other architectures, so that high-performance compilers can integrate this technique for efficient code generation. The technique is illustrated with an example.
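The detection step can be pictured as tracking the active memory bank while scanning the generated code: a bank selection instruction that re-selects the bank already active is redundant and can be dropped, and the state is conservatively forgotten at control transfers. The sketch below uses a simplified instruction list and a symbolic BANKSEL operation standing in for the RP0/RP1 bit manipulation on PIC16F87X-class devices; the relation matrix and the exhaustive data-allocation search of the paper are not reproduced.

```python
def eliminate_redundant_bank_switches(instructions):
    """Drop bank-select instructions that do not change the active bank state."""
    optimized, active_bank = [], None
    for mnemonic, operand in instructions:
        if mnemonic == "BANKSEL":
            if operand == active_bank:
                continue                  # redundant: this bank is already selected
            active_bank = operand
        elif mnemonic in ("CALL", "GOTO"):
            active_bank = None            # control transfer: bank state unknown afterwards
        optimized.append((mnemonic, operand))
    return optimized

program = [
    ("BANKSEL", 1), ("MOVWF", "TRISB"),
    ("BANKSEL", 1),                       # redundant, will be removed
    ("MOVWF", "TRISC"),
    ("BANKSEL", 0), ("MOVWF", "PORTB"),
]
for instruction in eliminate_redundant_bank_switches(program):
    print(instruction)
```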
Abstract:
In a business environment characterized by intense competition, building customer loyalty has become a key area of focus for most financial institutions. The explosion of the services sector, changing customer demographics, deregulation and the emergence of new technology in the financial services industry have had a critical impact on consumers' financial services buying behaviour. These changes have forced banks to modify their service offerings to customers so as to ensure high levels of customer satisfaction as well as high levels of customer retention. Banks have historically had difficulty distinguishing their products from one another because of their relative homogeneity; with increasing competition, the problem has only intensified, with no coherent distinguishing theme. Rising wealth, product proliferation, regulatory changes and newer technologies are together making bank switching easier for customers. In order to remain competitive, it is important for banks to retain their customer base.

The financial services sector is the foundation of any economy and plays the role of mobilizing resources and allocating them. The retail banking sector in India has emerged as one of the major drivers of the overall banking industry and has witnessed enormous growth. Switching behaviour has a negative impact on a bank's market share and profitability, as the costs of acquiring customers are much higher than the costs of retaining them. When customers switch, the business loses the potential for additional profits from the customer, and the initial costs invested in the customer are wasted.

The objective of the thesis was to examine the relationship among the triggers that customers experience, their perceptions of service quality, consumers' commitment and behavioural intentions in the contemporary Indian retail banking context, through the eyes of the customer. To understand customers' perception of these aspects, data were collected from retail banking customers alone for the purpose of analysis, though the banks' views were considered during the qualitative work carried out prior to the main study. No respondent who is an employee of a banking organization was considered for the final study, to avoid the possibility of any bias that could affect the results adversely. The data for the study were collected from customers who have switched banks and from those who were non-switchers. The study attempted to develop and validate a multidimensional construct of service quality for retail banking from the consumer's perspective. A major conclusion from the empirical research was the confirmation of the multidimensional construct of perceived service quality in the banking context. Switching can be viewed as an optimization problem for customers: customers weigh the potential gains of switching to another service provider against the costs of leaving the current provider. As banks do not provide tangible products, their service quality is usually assessed through the service provider's relationship with customers. Thus, banks should pay attention to their employees' skills and knowledge, assess customers' needs, and offer fast and efficient services.