8 results for Tool Development

at Cochin University of Science


Relevance:

40.00%

Publisher:

Abstract:

In the past, natural resources were plentiful and people were scarce. But the situation is rapidly reversing. Our challenge is to find a way to balance human consumption and nature’s limited productivity in order to ensure that our communities are sustainable locally, regionally and globally. Kochi, the commercial capital of Kerala, South India, and the second most important city on the Western coast after Mumbai, encompasses a wide variety of residential environments. Due to rapid population growth, changing lifestyles, food habits and living standards, institutional weaknesses, improper choice of technology and public apathy, the present pattern of the city can be classified as haphazard growth, with problems characteristic of unplanned urban development. Ecological Footprint Analysis (EFA) is a physical accounting method, developed by William Rees and M. Wackernagel, that focuses on land appropriation using land as its “currency”. It provides a means of measuring and communicating human-induced environmental impacts upon the planet. The aim of applying EFA to Kochi city is to quantify the consumption and waste generation of its population and to compare them with the existing biocapacity. By quantifying the ecological footprint, we can formulate strategies to reduce it and thereby achieve sustainable living. In this paper, an attempt is made to explore Ecological Footprint Analysis as a tool and to calculate and analyse the ecological footprint of the residential areas of Kochi city. The paper also discusses and analyses the waste footprint of the city, and suggests strategies to reduce the footprint, thereby making the city sustainable.
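To make the accounting concrete, the following is a minimal sketch of the per-capita footprint arithmetic that EFA rests on: each consumption category is divided by a world-average yield and scaled by an equivalence factor into global hectares, then compared against biocapacity. All category names, quantities, yields and factors below are illustrative assumptions, not figures from the paper.

```python
# Minimal sketch of per-capita Ecological Footprint arithmetic (Rees/Wackernagel).
# All consumption figures, yields and equivalence factors are illustrative
# placeholders, not data from the paper.

# annual per-capita consumption (kg) and world-average yield (kg per hectare)
consumption = {"food": 600.0, "timber": 50.0, "energy_co2": 2000.0}
yields = {"food": 2500.0, "timber": 1500.0, "energy_co2": 5200.0}
# equivalence factors convert each land type to world-average bioproductive hectares
equivalence = {"food": 2.5, "timber": 1.3, "energy_co2": 1.3}

def footprint_per_capita(consumption, yields, equivalence):
    """Sum over categories: (consumption / yield) * equivalence -> global hectares."""
    return sum(consumption[c] / yields[c] * equivalence[c] for c in consumption)

ef = footprint_per_capita(consumption, yields, equivalence)
biocapacity = 0.4  # assumed local biocapacity, global hectares per capita
print(f"footprint = {ef:.2f} gha/capita, deficit = {ef - biocapacity:.2f} gha/capita")
```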

Relevance:

30.00%

Publisher:

Abstract:

During the 1990s, the Wavelet Transform emerged as an important signal processing tool with potential applications in time-frequency analysis and non-stationary signal processing. Wavelets have gained popularity in a broad range of disciplines such as signal/image compression, medical diagnostics, boundary value problems, geophysical signal processing, statistical signal processing, pattern recognition and underwater acoustics. In 1993, G. Evangelista introduced the Pitch-Synchronous Wavelet Transform, which is particularly suited for pseudo-periodic signal processing. The work presented in this thesis concentrates on two interrelated topics in signal processing: Wavelet Transform based signal compression and the computation of the Discrete Wavelet Transform. A new compression scheme is described in which the Pitch-Synchronous Wavelet Transform is combined with the popular Linear Predictive Coding method for pseudo-periodic signal processing. Subsequently, a novel Parallel Multiple Subsequence structure is presented for the efficient computation of the Wavelet Transform. Case studies are also presented to highlight potential applications.
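As a concrete illustration of the analysis/synthesis step that any DWT computation builds on, here is a minimal single-level Haar DWT in Python with NumPy. The Haar wavelet is chosen only for brevity; it is not the pitch-synchronous scheme or the Parallel Multiple Subsequence structure developed in the thesis.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: returns (approximation, detail) coefficients.
    Assumes len(x) is even."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt: perfectly reconstructs the input."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x

# toy pseudo-periodic signal: a decaying sine over two periods
t = np.linspace(0, 4 * np.pi, 64)
signal = np.exp(-0.1 * t) * np.sin(t)
a, d = haar_dwt(signal)
assert np.allclose(haar_idwt(a, d), signal)  # perfect reconstruction
```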

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes the development and analysis of an Isosceles Trapezoidal Dielectric Resonator Antenna (ITDRA), realized in different DR orientations with suitable feed configurations so that it can serve multiband, dual-band dual-polarized and wideband applications. The work is motivated by the need for compact, highly efficient, low-cost antennas suitable for multiband, dual-band dual-polarized and broadband operation, compatible with MICs, and by the demand for less expensive, more efficient and higher-quality wireless communication systems. To satisfy these demands, a novel-shaped Dielectric Resonator (DR) is fabricated and investigated by trying out different orientations of the DR on a simple microstrip feed, as well as with a slotted ground plane. The thesis first reviews recent and past developments on this topic in the microwave industry through a concise survey of the literature. The theoretical aspects of DRAs and different feeding techniques are then described, followed by the fabrication and characterization of the DRA. Both simulations and experimental measurements were undertaken to achieve the desired requirements. A 3-D finite element method (FEM) electromagnetic simulation tool, HFSS™, is used to determine the optimum geometry of the dielectric resonator; it proved useful in producing approximate results, although it had some limitations. A numerical technique, the finite-difference time-domain (FDTD) method, is used to validate the results of the wideband design. MATLAB is used for modeling the ITDR and implementing the FDTD analysis. In conclusion, this work offers a new, efficient and relatively simple alternative for antennas serving multiple requirements in wireless communication systems.
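To illustrate the FDTD validation step, the sketch below implements a textbook 1-D Yee leapfrog update in Python (NumPy), with a dielectric slab standing in for the resonator. The grid, source and material values are illustrative assumptions; the thesis's actual FDTD model of the wideband design is three-dimensional and implemented in MATLAB.

```python
import numpy as np

# Minimal 1-D FDTD (Yee scheme) in normalized units: the leapfrog E/H update
# that full 3-D antenna solvers generalize. All values are illustrative.
nz, nt = 200, 400
eps_r = np.ones(nz)          # relative permittivity profile
eps_r[100:140] = 10.0        # a dielectric slab standing in for the resonator
ez = np.zeros(nz)            # electric field
hy = np.zeros(nz - 1)        # magnetic field, staggered half a cell
S = 0.5                      # Courant number (1-D stability requires S <= 1)

for n in range(nt):
    hy += S * (ez[1:] - ez[:-1])                       # H update (half step)
    ez[1:-1] += S / eps_r[1:-1] * (hy[1:] - hy[:-1])   # E update
    ez[20] += np.exp(-((n - 60) / 20.0) ** 2)          # soft Gaussian pulse source

print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")
```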

Relevance:

30.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development, which necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code. Such analysis can significantly improve software quality, yet remains a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, using static analysis of machine code to make early detection of otherwise hard-to-find software bugs more effective. The focus of this work is to develop methods that automatically localize faults and optimize the code, thereby improving both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing computational and integrated-peripheral functions. The stipulated rules are encoded as propositional logic formulae, and their compliance is tested individually on all possible execution paths of the application programs.

An incorrect machine-code sequence is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed that assists the compiler in eliminating redundant bank-switching code and in deciding the optimum allocation of data to banked memory, so that embedded system software contains the minimum number of bank-switching instructions. A relation matrix and a state transition diagram, formed for the active memory-bank state transitions corresponding to each bank-selection instruction, are used to detect the redundant code; instances of redundancy are identified from the rules stipulated for the target processor. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool should be very useful in steering novices towards correct use of difficult microcontroller features when developing embedded systems.
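The redundant bank-switching detection can be illustrated with a small sketch: track the active memory-bank state along a straight-line code path and flag any bank-select instruction whose state transition is a self-loop. The instruction encoding and the two-bit PIC16-style RP0/RP1 bank model below are simplified assumptions, not the dissertation's relation-matrix implementation.

```python
# Sketch: flag bank-select writes that re-select the already-active bank.
# Two-bit PIC16-style RP0/RP1 model; instruction tuples are illustrative.

def find_redundant_bank_switches(instructions):
    """instructions: list of (addr, mnemonic, operand) tuples.
    Returns addresses of bank-select writes that leave the bank unchanged."""
    bank = {"RP0": 0, "RP1": 0}                    # reset state: bank 0 active
    redundant = []
    for addr, mnemonic, operand in instructions:
        if mnemonic in ("BSF", "BCF") and operand in ("STATUS,RP0", "STATUS,RP1"):
            bit = operand.split(",")[1]            # "RP0" or "RP1"
            new_val = 1 if mnemonic == "BSF" else 0
            if bank[bit] == new_val:
                redundant.append(addr)             # self-loop: bank unchanged
            bank[bit] = new_val
    return redundant

code = [
    (0x00, "BSF", "STATUS,RP0"),   # switch to bank 1
    (0x01, "MOVWF", "TRISB"),
    (0x02, "BSF", "STATUS,RP0"),   # redundant: bank 1 already active
    (0x03, "BCF", "STATUS,RP0"),   # back to bank 0
]
print(find_redundant_bank_switches(code))   # -> [2]
```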

Relevance:

30.00%

Publisher:

Abstract:

The basic concepts of digital signal processing are taught to students of engineering and science. The focus of such courses is on linear, time-invariant systems; the question of what happens when the system is governed by a quadratic or cubic equation remains unanswered in the vast majority of the signal processing literature. Light was shed on this problem when V. John Mathews and Giovanni L. Sicuranza published the book Polynomial Signal Processing, which opened up a previously unseen vista of polynomial systems for signal and image processing. The book presents the theory and implementation of both adaptive and non-adaptive, FIR and IIR quadratic systems, which offer improved performance over conventional linear systems. The theory of quadratic systems is a largely unexplored area of research that demands computationally intensive work. Once the area of research was selected, the next issue was the choice of software tool for carrying out the work. Conventional languages like C and C++ were eliminated early, as they are not interpreted and lack good-quality plotting libraries. MATLAB proved to be very slow, as did SCILAB and Octave. The search for a scientific computing language that was as fast as C but had a good-quality plotting library ended with Python, a distant relative of LISP, which proved ideal for scientific computing. An account of the use of Python, its scientific computing package scipy and the plotting library pylab is given in the appendix. Initially, the work focused on designing predictors that exploit the polynomial nonlinearities inherent in speech generation mechanisms. The work soon moved into medical image processing, which offered more potential for quadratic methods. The major focus in this area is on quadratic edge detection methods for retinal images and fingerprints, as well as on de-noising raw MRI signals.
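As a minimal example of the quadratic systems discussed here, the following Python/NumPy sketch evaluates a second-order Volterra FIR filter, y[n] = sum_i h1[i] x[n-i] + sum_{i,j} h2[i,j] x[n-i] x[n-j], of the kind treated by Mathews and Sicuranza. The kernel values are illustrative assumptions.

```python
import numpy as np

def volterra2(x, h1, h2):
    """Second-order Volterra FIR filter:
    y[n] = sum_i h1[i] x[n-i] + sum_{i,j} h2[i,j] x[n-i] x[n-j]."""
    x = np.asarray(x, dtype=float)
    m = len(h1)
    y = np.zeros_like(x)
    for n in range(len(x)):
        # vector of the m most recent samples, zero-padded at the start
        xn = np.array([x[n - i] if n - i >= 0 else 0.0 for i in range(m)])
        y[n] = h1 @ xn + xn @ h2 @ xn
    return y

h1 = np.array([0.5, 0.3, 0.1])    # linear kernel (illustrative)
h2 = 0.05 * np.eye(3)             # quadratic kernel, diagonal for simplicity
x = np.sin(2 * np.pi * 0.05 * np.arange(32))
print(volterra2(x, h1, h2)[:5])
```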

Relevance:

30.00%

Publisher:

Abstract:

Solid waste management is nowadays an important environmental issue in a country like India. Statistics show a substantial increase in solid waste generation, especially in urban areas. This trend can be ascribed to rapid population growth, changing lifestyles, food habits and living standards, lack of financial resources, institutional weaknesses, improper choice of technology and public apathy towards municipal solid waste. Waste is directly related to the consumption of resources and to dumping on land. Ecological footprint analysis, an impact-assessment environmental management tool, relates the two by estimating the amount of land required to dispose of the waste generated per capita. It is a quantitative tool that represents the ecological load imposed on the earth by humans in spatial terms. By quantifying the ecological footprint, we can formulate strategies to reduce it and thereby achieve sustainable living. In this paper, an attempt is made to explore Ecological Footprint Analysis with special emphasis on waste generation. The paper also discusses and analyses the waste footprint of Kochi city, India, and suggests strategies to reduce it, thereby making the city sustainable, greener and cleaner.
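A minimal sketch of the waste-footprint arithmetic described above: the land demanded per capita is the waste generated per capita divided by the disposal or assimilation capacity of a hectare of land. All figures below are illustrative assumptions, not the paper's survey data.

```python
# Waste-footprint sketch: hectares demanded per capita to absorb generated waste.
# All figures are illustrative placeholders, not data from the paper.

per_capita_waste_kg_per_year = 180.0          # assumed generation rate
landfill_capacity_kg_per_ha_year = 50_000.0   # assumed disposal capacity per hectare
population = 600_000                          # assumed city population

waste_footprint_ha = per_capita_waste_kg_per_year / landfill_capacity_kg_per_ha_year
print(f"waste footprint: {waste_footprint_ha:.4f} ha/capita, "
      f"city total: {waste_footprint_ha * population:.0f} ha")
```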

Relevance:

30.00%

Publisher:

Abstract:

Kochi, the commercial capital of Kerala and the second most important city on the Western coast of India after Mumbai, encompasses a wide variety of residential environments. The present pattern of the city can be classified as haphazard growth, with problems characteristic of unplanned urban development. This trend can be ascribed to rapid population growth, changing lifestyles, food habits and living standards, institutional weaknesses, improper choice of technology and public apathy. Ecological footprint analysis (EFA) is a quantitative tool that represents the ecological load imposed on the earth by humans in spatial terms. This paper analyses the scope of EFA as a sustainable environmental management tool for Kochi city.

Relevance:

30.00%

Publisher:

Abstract:

The semiconductor industry's urge towards faster, smaller and cheaper integrated circuits has led the industry to smaller node devices. The integrated circuits now in volume production belong to the 22 nm and 14 nm technology nodes. In 2007, the 45 nm technology arrived with the revolutionary high-k/metal gate structure. The 22 nm technology uses a fully depleted tri-gate transistor structure, and the 14 nm technology is a continuation of it; Intel is using second-generation tri-gate technology in 14 nm devices. After 14 nm, the semiconductor industry is expected to continue scaling with 10 nm devices followed by 7 nm, and IBM has recently announced the successful production of 7 nm node test chips. This is how the nanoelectronics industry is proceeding with its scaling trend. The present technology nodes require selective deposition and selective removal of materials; atomic layer deposition and atomic layer etching are the respective techniques used. Atomic layer deposition remains a forward-looking manufacturing approach that deposits materials and films in exact places. Beyond the nano/microelectronics industry, ALD is also widening its application areas and acceptance, and the use of ALD equipment in industry shows a diversification trend: large-area, batch-processing, particle-ALD and plasma-enhanced ALD equipment are becoming prominent in industrial applications. In this work, the development of an atomic layer deposition tool with microwave plasma capability is described, one that is affordable even for lightly funded research labs.
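As a rough illustration of what such a tool must orchestrate, the sketch below sequences the canonical plasma-enhanced ALD cycle (precursor dose, purge, plasma exposure, purge). The valve names, timings and the actuate callback are hypothetical, invented for illustration; they do not describe the instrument built in this work.

```python
import time

# Hypothetical sketch of plasma-enhanced ALD cycle sequencing: precursor dose,
# purge, plasma step, purge, repeated once per (at most) one monolayer.
# Valve names, timings and the actuate() hook are invented for illustration.

CYCLE = [
    ("precursor_valve", 0.1),   # self-limiting precursor dose
    ("purge_valve", 2.0),       # purge excess precursor and by-products
    ("plasma_on", 1.0),         # microwave plasma supplies the co-reactant
    ("purge_valve", 2.0),       # purge again before the next dose
]

def run_ald(cycles, actuate=lambda name, state: None):
    """Run the four-step PEALD sequence `cycles` times.
    `actuate(name, state)` is a stand-in for real valve/plasma I/O."""
    for n in range(cycles):
        for name, seconds in CYCLE:
            actuate(name, True)
            time.sleep(seconds)
            actuate(name, False)
        print(f"cycle {n + 1}: at most one monolayer deposited")

run_ald(3)
```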