13 results for Tool-workpiece contact

at Cochin University of Science


Relevance:

20.00%

Publisher:

Abstract:

A forward-biased point-contact germanium signal diode placed inside a waveguide section along the E-vector is found to introduce a significant phase shift in microwave signals. The usefulness of the arrangement as a phase modulator for microwave carriers is demonstrated. While a less significant amplitude modulation accompanies the phase modulation, the insertion losses are found to be negligible. The observations can be explained on the basis of the variation of the barrier-layer capacitance with forward current in the diode.

Relevance:

20.00%

Publisher:

Abstract:

The photothermal effect refers to the heating of a sample due to the absorption of electromagnetic radiation. Photothermal (PT) heat generation, an example of energy conversion, has in general three kinds of applications: 1. PT material probing, 2. PT material processing and 3. PT material destruction. The temperatures involved increase from application 1 to 3. Of the three, PT material probing is the most important in making a significant contribution to science and technology. Photothermal material characterization relies on high-sensitivity detection techniques to monitor the effects caused by PT heating of a sample. The photothermal method is a powerful, high-sensitivity, non-contact tool used for the non-destructive thermal characterization of materials. The high sensitivity of photothermal methods has led to their application in the analysis of low-absorbance samples. Laser calorimetry, photothermal radiometry, the pyroelectric technique, the photoacoustic technique, the photothermal beam deflection technique, etc. come under the broad class of photothermal techniques. However, the choice of a suitable technique depends upon the nature of the sample, the purpose of measurement, the nature of the light source used, etc. The present investigations are done on polymer thin films employing the photothermal beam deflection technique for the determination of their thermal diffusivity. Here the sample is excited by a He-Ne laser (λ = 6328 Å) which acts as the pump beam. Due to the refractive index gradient established at the sample surface and in the adjacent coupling medium, another optical beam called the probe beam (diode laser, λ = 6500 Å), when passed through this region, experiences a deflection and is detected using a position-sensitive detector whose output is fed to a lock-in amplifier, from which the amplitude and phase of the deflection can be directly obtained.
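In the beam-deflection geometry, a standard way to recover the thermal diffusivity is from the slope of the lock-in phase versus pump-probe offset, since the thermal-wave phase falls off as x·sqrt(πf/α). The following is a minimal sketch with entirely hypothetical numbers; the linear phase model is an assumption here, not a detail stated in the abstract:

```python
import numpy as np

# Hypothetical phase-vs-offset data from the lock-in amplifier.
# Thermal-wave assumption: phi(x) = phi0 - x * sqrt(pi * f / alpha).
f = 10.0                                  # modulation frequency (Hz), assumed
x = np.linspace(0.0, 2e-3, 20)            # pump-probe offsets (m)
alpha_true = 1.2e-7                       # thermal diffusivity (m^2/s), assumed
phi = 1.0 - x * np.sqrt(np.pi * f / alpha_true)

slope, intercept = np.polyfit(x, phi, 1)  # linear fit of phase vs offset
alpha = np.pi * f / slope**2              # recover diffusivity from the slope
print(alpha)                              # ≈ 1.2e-7 m^2/s
```

In practice the fit would be restricted to the offset range where this thermally thick, linear approximation actually holds.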
The amplitude and phase of the signal are suitably analysed to determine the thermal diffusivity. The production of polymer thin film samples has gained considerable attention over the past few years. Plasma polymerization is an inexpensive tool for fabricating organic thin films. It refers to the formation of polymeric materials under the influence of a plasma, which is generated by some kind of electric discharge. Here the plasma of the monomer vapour is generated by employing radio frequency (MHz) techniques. The plasma polymerization technique results in homogeneous, highly adhesive, thermally stable, pinhole-free, dielectric, highly branched and cross-linked polymer films. The possible linkages in the formation of the polymers are suggested by comparing the FTIR spectra of the monomer and the polymer. Near-IR overtone investigations on some organic molecules using the local mode model are also done. Higher vibrational overtones often provide spectral simplification and greater resolution of peaks corresponding to nonequivalent X-H bonds, where X is typically C, N or O. Vibrational overtone spectroscopy of molecules containing X-H oscillators is now a well-established tool for molecular investigations. Conformational and steric differences between bonds and the structural inequivalence of CH bonds (methyl, aryl, acetylenic, etc.) are resolvable in the higher overtone spectra. The local mode model, in which the X-H oscillators are considered to be loosely coupled anharmonic oscillators, has been widely used for the interpretation of overtone spectra. If a single local oscillator is excited from the vibrational ground state to the vibrational state v, the transition energy of the local mode overtone is given by ΔE(0→v) = Av + Bv². A plot of ΔE/v versus v will yield A, the local mode frequency, as the intercept and B, the local mode diagonal anharmonicity, as the slope.
Here A − B gives the mechanical frequency X1 of the oscillator and B = X2 is the anharmonicity of the bond. The local mode parameters X1 and X2 vary for non-equivalent X-H bonds and are sensitive to the inter- and intramolecular environment of the X-H oscillator.
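The linear fit described above can be sketched numerically; the transition energies below are hypothetical, chosen only to show how A, B, X1 and X2 are recovered from a ΔE/v versus v plot:

```python
import numpy as np

# Hypothetical overtone transition energies (cm^-1) obeying dE(0->v) = A*v + B*v^2.
v = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
A_true, B_true = 3000.0, -60.0        # assumed local mode frequency / anharmonicity
dE = A_true * v + B_true * v**2

# Fit dE/v against v: the intercept gives A, the slope gives B.
B, A = np.polyfit(v, dE / v, 1)
X1 = A - B                            # mechanical frequency of the oscillator
X2 = B                                # anharmonicity of the bond
print(A, B, X1, X2)
```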

Relevance:

20.00%

Publisher:

Abstract:

In this thesis, the applications of recurrence quantification analysis (RQA) to metal cutting operations in a lathe, with the specific objective of detecting tool wear and chatter, are presented. This study is based on the discovery that the process dynamics in a lathe is low-dimensional chaotic. It implies that the machine dynamics is controllable using principles of chaos theory. This understanding is set to revolutionize the feature extraction methodologies used in condition monitoring systems, as conventional linear methods or models are incapable of capturing the critical and strange behaviors associated with the metal cutting process. As sensor-based approaches provide an automated and cost-effective way to monitor and control, an efficient feature extraction methodology based on nonlinear time series analysis is in great demand. The task here is more complex when the information has to be deduced solely from sensor signals, since traditional methods do not address how to treat the noise present in real-world processes and their non-stationarity. In an effort to overcome these two issues to the maximum extent possible, this thesis adopts the recurrence quantification analysis methodology, since this feature extraction technique is found to be robust against noise and non-stationarity in the signals. The work consists of two different sets of experiments in a lathe: set-1 and set-2. Experiment set-1 studies the influence of tool wear on the RQA variables, whereas set-2 is carried out to identify the RQA variables sensitive to machine tool chatter, followed by validation in actual cutting. To obtain the bounds of the spectrum of the significant RQA variable values in set-1, a fresh tool and a worn tool are used for cutting. The first part of the set-2 experiments uses a stepped shaft in order to create chatter at a known location.
The second part uses a conical section with a uniform taper along the axis, causing chatter to onset at some distance from the smaller end as the depth of cut is gradually increased while the spindle speed and feed rate are kept constant. The study concludes by revealing the unambiguous dependence of certain RQA variables (percent determinism, percent recurrence and entropy) on tool wear and chatter. The performance of the results establishes this methodology as viable for the detection of tool wear and chatter in metal cutting operations in a lathe. The key reason is that the dynamics of the system under study is nonlinear, and recurrence quantification analysis can characterize it adequately. This work establishes that the principles and practice of machining can benefit considerably from nonlinear dynamics and chaos theory.
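The recurrence-based measures named above can be illustrated on a synthetic signal. This is a minimal sketch with assumed embedding dimension, delay and threshold, not the thesis's actual implementation; a periodic signal should yield a determinism close to 1, and in the thesis tool wear and chatter are tracked through changes in these same variables:

```python
import numpy as np

def rqa(x, dim=3, tau=1, eps=0.2, lmin=2):
    """Toy recurrence quantification: recurrence rate and determinism."""
    # Time-delay embedding of the scalar signal.
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    # Recurrence matrix: 1 where two embedded states are closer than eps.
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    R = (dist < eps).astype(int)
    rec = R.sum() / R.size                         # percent recurrence (as a fraction)
    # Percent determinism: recurrent points lying on diagonal lines >= lmin.
    det_points = 0
    for k in range(-(n - 1), n):
        run = 0
        for p in np.append(np.diagonal(R, k), 0):  # sentinel closes the last run
            if p:
                run += 1
            else:
                if run >= lmin:
                    det_points += run
                run = 0
    det = det_points / max(R.sum(), 1)
    return rec, det

t = np.linspace(0.0, 8 * np.pi, 400)
rec, det = rqa(np.sin(t))
print(rec, det)   # a periodic signal gives determinism close to 1
```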

Relevance:

20.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design as well as its hardware/software development can be highly optimized. Embedded software must meet requirements such as high-reliability operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly augment software quality and is still a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making early detection of otherwise hard-to-detect software bugs more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, thus improving both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum data allocation to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software.
A relation matrix and a state transition diagram, formed for the active-memory-bank state transition corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code pattern, which drastically reduces state-space creation, contributing to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards the correct use of difficult microcontroller features in developing embedded systems.
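The redundant bank-switch detection described above amounts to tracking the active-bank state across each bank selection instruction and flagging self-loop transitions. A minimal sketch on a toy straight-line sequence (the instruction tuples are hypothetical; the actual tool works on PIC16F87X machine code and its control flow graph):

```python
def redundant_switches(program):
    """Return indices of bank-select instructions that re-select the active bank."""
    active = 0                      # assume bank 0 is active on entry
    redundant = []
    for i, (op, arg) in enumerate(program):
        if op == "BANKSEL":
            if arg == active:       # state transition is a self-loop -> redundant
                redundant.append(i)
            active = arg
    return redundant

code = [("BANKSEL", 1), ("MOVWF", "TRISB"),
        ("BANKSEL", 1), ("MOVWF", "TRISA"),   # redundant: bank 1 already active
        ("BANKSEL", 0), ("MOVWF", "PORTB")]
print(redundant_switches(code))     # → [2]
```

On real code the active-bank state would have to be propagated along every path of the control flow graph, not just a straight-line sequence.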

Relevance:

20.00%

Publisher:

Abstract:

Learning Disability (LD) is a neurological condition that affects a child's brain and impairs the ability to carry out one or many specific tasks. LD affects about 15% of children enrolled in schools. The prediction of LD is a vital and intricate job. The aim of this paper is to design an effective and powerful tool, using two intelligent methods, viz. Artificial Neural Network and Adaptive Neuro-Fuzzy Inference System, for measuring the percentage of LD affecting school-age children. In this study, we propose some soft computing methods for data preprocessing to improve the accuracy of the tool as well as the classifier. The data preprocessing is performed through Principal Component Analysis for attribute reduction, and the closest-fit algorithm is used for imputing missing values. The main idea in developing the LD prediction tool is not only to predict the LD present in children but also to measure its percentage along with its class, such as low, minor or major. The system is implemented in MathWorks MATLAB 7.10. The results obtained from this study illustrate that the designed prediction system is capable of measuring LD effectively.
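A rough sketch of the two preprocessing steps described above (closest-fit imputation of missing values, then PCA projection for attribute reduction). The data and shapes are hypothetical, and the paper's actual pipeline is implemented in MATLAB:

```python
import numpy as np

def closest_fit_impute(X):
    """Fill NaNs in each row from the most similar row that has those values."""
    X = X.copy()
    for i, row in enumerate(X):
        miss = np.isnan(row)
        if not miss.any():
            continue
        best, best_d = None, np.inf
        for j, other in enumerate(X):
            if j == i or np.isnan(other[miss]).any():
                continue                       # candidate must cover the gaps
            shared = ~miss & ~np.isnan(other)
            d = np.mean(np.abs(row[shared] - other[shared]))
            if d < best_d:
                best, best_d = other, d
        row[miss] = best[miss]                 # copy values from the closest fit
    return X

def pca_reduce(X, k):
    """Project mean-centered data onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

X = np.array([[1.0, 2.0, 3.0],
              [1.1, np.nan, 3.1],
              [5.0, 6.0, 7.0]])
Xf = closest_fit_impute(X)
print(Xf[1, 1])                    # filled from the closest row → 2.0
Z = pca_reduce(Xf, 2)
print(Z.shape)                     # (3, 2)
```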

Relevance:

20.00%

Publisher:

Abstract:

Tourism is an industry which is heavily dependent on marketing. Word-of-mouth communication has played a major role in shaping a number of destinations. This is particularly true in modern parlance with the social networking phenomenon, which is fast spreading over the internet. Many sites give visitors a lot of freedom to express their views. Promotion of a destination depends a lot on conversation and the exchange of information over these social networks. This paper analyses social networking sites and their contribution to marketing tourism and hospitality. The negative impacts of the phenomenon are also discussed.

Relevance:

20.00%

Publisher:

Abstract:

The present work derives motivation from the so-called surface/interfacial magnetism in core-shell structures. Commercial samples of Fe3O4 and γ-Fe2O3 with sizes ranging from 20 to 30 nm were coated with polyaniline using plasma polymerization and studied. The High Resolution Transmission Electron Microscopy images indicate a core-shell structure after polyaniline coating, and the samples exhibited an increase in saturation magnetization of 2 emu/g. For confirmation, plasma polymerization was performed on maghemite nanoparticles, which also exhibited an increase in saturation magnetization. This enhanced magnetization is rather surprising, and the reason is found to be an interfacial phenomenon resulting from a contact potential.

Relevance:

20.00%

Publisher:

Abstract:

The need to structure knowledge is as important now as it ever has been. This paper studies the ISP knowledge portal to explore how knowledge on various resources and topics in photonics and related areas is organized in the knowledge portal of the International School of Photonics, CUSAT. The study revealed that the ISP knowledge portal is one of the best portals in the field. It provides a model for building an effective knowledge portal in other fields.

Relevance:

20.00%

Publisher:

Abstract:

Solid waste management is nowadays an important environmental issue in a country like India. Statistics show that there has been a substantial increase in solid waste generation, especially in urban areas. This trend can be ascribed to rapid population growth, changing lifestyles, food habits and living standards, lack of financial resources, institutional weaknesses, improper choice of technology and public apathy towards municipal solid waste. Waste is directly related to the consumption of resources and dumping on the land. Ecological footprint analysis, an impact-assessment environmental management tool, relates these two factors through the amount of land required to dispose of the per capita generated waste. Ecological footprint analysis is a quantitative tool that represents the ecological load imposed on the earth by humans in spatial terms. By quantifying the ecological footprint, we can formulate strategies to reduce the footprint and thereby achieve sustainable living. In this paper, an attempt is made to explore the tool Ecological Footprint Analysis with special emphasis on waste generation. The paper also discusses and analyses the waste footprint of Kochi city, India. An attempt is also made to suggest strategies to reduce the waste footprint, thereby making the city sustainable, greener and cleaner.

Relevance:

20.00%

Publisher:

Abstract:

Kochi, the commercial capital of Kerala and the second most important city after Mumbai on the western coast of India, is an area with a wide variety of residential environments. The present pattern of the city can be classified as haphazard growth with the typical problems characteristic of unplanned urban development. This trend can be ascribed to rapid population growth, our changing lifestyles, food habits and living standards, institutional weaknesses, improper choice of technology and public apathy. Ecological footprint analysis (EFA) is a quantitative tool that represents the ecological load imposed on the earth by humans in spatial terms. This paper analyses the scope of EFA as a sustainable environmental management tool for Kochi city.

Relevance:

20.00%

Publisher:

Abstract:

In the past, natural resources were plentiful and people were scarce. But the situation is rapidly reversing. Our challenge is to find a way to balance human consumption and nature's limited productivity in order to ensure that our communities are sustainable locally, regionally and globally. Kochi, the commercial capital of Kerala, South India, and the second most important city after Mumbai on the western coast, is an area with a wide variety of residential environments. Due to rapid population growth, changing lifestyles, food habits and living standards, institutional weaknesses, improper choice of technology and public apathy, the present pattern of the city can be classified as haphazard growth with the typical problems characteristic of unplanned urban development. Ecological Footprint Analysis (EFA) is a physical accounting method, developed by William Rees and M. Wackernagel, focusing on land appropriation using land as its "currency". It provides a means for measuring and communicating human-induced environmental impacts upon the planet. The aim of applying EFA to Kochi city is to quantify the consumption and waste generation of a population and to compare it with the existing biocapacity. By quantifying the ecological footprint, we can formulate strategies to reduce the footprint and thereby achieve sustainable living. In this paper, an attempt is made to explore the tool Ecological Footprint Analysis and to calculate and analyse the ecological footprint of the residential areas of Kochi city. The paper also discusses and analyses the waste footprint of the city. An attempt is also made to suggest strategies to reduce the footprint, thereby making the city sustainable.
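The consumption-versus-biocapacity comparison described above reduces to simple land-area accounting. A sketch with entirely hypothetical per-capita figures (EFA in practice uses many more consumption categories, plus yield and equivalence factors):

```python
# Illustrative ecological-footprint arithmetic (all figures hypothetical).
# EFA converts each consumption item into the land area needed to supply it
# or to assimilate its waste, then compares the total with biocapacity.
consumption = {                     # per-capita annual consumption (kg)
    "food_kg": 600.0,
    "waste_kg": 250.0,
}
yields_ = {                         # land yield / assimilation rate (kg per ha per yr)
    "food_kg": 2000.0,
    "waste_kg": 1000.0,
}
footprint_ha = sum(consumption[k] / yields_[k] for k in consumption)
biocapacity_ha = 0.4                # available bioproductive area per capita, assumed
deficit = footprint_ha - biocapacity_ha
print(footprint_ha, deficit)        # ≈ 0.55 ha footprint, ≈ 0.15 ha ecological deficit
```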

Relevance:

20.00%

Publisher:

Abstract:

A sandwich construction is a special form of laminated composite consisting of a lightweight core sandwiched between two stiff, thin face sheets. Due to their high stiffness-to-weight ratio, sandwich constructions are widely adopted in the aerospace industry. As a process-dependent bonded structure, the most severe defects associated with sandwich construction are debond (skin-core bond failure) and dent (locally deformed skin associated with core crushing). Debond may be attributed to initial manufacturing flaws or in-service loads, and dents can be caused by tool drops or impacts by foreign objects. This paper presents an evaluation of the performance of a honeycomb sandwich cantilever beam in the presence of debond or dent, using layered finite element models. Dent is idealized by accounting for core crushing in the core thickness along with the eccentricity of the skin. Debond is idealized using multilaminate modeling at the debond location with contact elements between the laminates. Vibration and buckling analyses of metallic honeycomb sandwich beams with and without damage are carried out. Buckling load factor, natural frequency, mode shape and modal strain energy are evaluated using the finite element package ANSYS 13.0. The study shows that debond affects the performance of the structure more severely than dent. The reduction in the fundamental frequencies due to the presence of dent or debond is not significant for the case considered, but debond reduces the buckling load factor significantly. A dent of size 8-20% of the core thickness shows a 13% reduction in the buckling load capacity of the sandwich column, whereas a debond of the same size reduces the buckling load capacity by about 90%. This underscores the importance of detecting these damages at the initiation stage itself to avoid catastrophic failures. The influence of the damages on fundamental frequencies, mode shapes and modal strain energy is examined. The effectiveness of these parameters as a damage detection tool for sandwich structures is also assessed.

Relevance:

20.00%

Publisher:

Abstract:

One of the objectives of the current investigation was to evaluate the effectiveness of Spirodela polyrhiza in removing heavy metals and other contaminants from water samples collected from the wetland sites of Eloor and Kannamaly under controlled conditions. The results obtained from the current study suggest that the test material S. polyrhiza can be used in the biomonitoring and phytoremediation of municipal, agricultural and industrial effluents because of its simplicity, sensitivity and cost-effectiveness. The study throws light on the potential of this plant as an assessment tool in two diverse wetlands in Ernakulam district. The results show the usefulness of combining physicochemical analysis with bioassays, as such an approach ensures a better understanding of the toxicity of chemical pollutants and their influence on plant health. The results also show the suitability of the Spirodela plant for surface water quality assessment, as all selected parameters showed consistency with respect to water samples collected over the three monitoring periods. Similarly, the relationship between the change in exposure period (2, 4 and 8 days) and the parameters was also studied in detail. Spirodela is a consistent test material, being a homogeneous plant material owing to its predominantly vegetative reproduction. New fronds are formed by clonal propagation, producing a population of genetically homogeneous plants; the result is small variability between treated individuals. It has been observed that phytoremediation of water samples collected from Eloor and Kannamaly using the floating plant system is a predominant method which is economical to construct, requires little maintenance and is eco-friendly.