13 results for automation roadmap
at Cochin University of Science and Technology
Abstract:
The aim of the present work was to automate the CSP process, to deposit and characterize CuInS2/In2S3 layers using this system, and to fabricate devices using these films. An automated spray system for the deposition of compound semiconductor thin films was designed and developed so as to eliminate the manual labour involved in spraying and to facilitate standardization of the method. The system was designed such that parameters like spray rate, movement of the spray head, duration of spray, substrate temperature, carrier gas pressure and height of the spray head above the substrate could be varied. Using this system, binary, ternary as well as quaternary films could be successfully deposited. The second part of the work dealt with the deposition and characterization of CuInS2 and In2S3 layers respectively. In the case of CuInS2 absorbers, the effects of different preparation conditions and post-deposition treatments on the optoelectronic, morphological and structural properties were investigated. It was observed that preparation conditions and post-deposition treatments played a crucial role in controlling the properties of the films. The studies in this direction were useful in understanding how variation of the spray parameters tailored the properties of the absorber layer. These results were subsequently made use of in the device fabrication process. The effects of copper incorporation in In2S3 films were investigated to find how the diffusion of Cu from CuInS2 to In2S3 would affect the properties at the junction. It was noticed that there was a regular variation in the optoelectronic properties with increase in copper concentration. Devices were fabricated on ITO-coated glass using CuInS2 as absorber and In2S3 as buffer layer, with silver as the top electrode. Stable devices could be deposited over an area of 0.25 cm2, even though the efficiency obtained was not high. Using the manual spray system, we could achieve devices of area 0.01 cm2 only.
Thus automation helped in obtaining repeatable results over larger areas than those obtained with the manual unit. Silver diffusion into the cells before coating the electrodes resulted in better collection of carriers. From this work it was seen that the CuInS2/In2S3 junction deposited through the automated spray process has the potential to achieve high efficiencies.
Abstract:
The shift from print to digital information has a high impact on all components of the academic library system in India, especially the users, services and staff. Though information is considered an important resource, the use of ICT tools to collect and disseminate information has proceeded at a slow pace in the majority of Indian university libraries. This may be due to various factors like insufficient funds, inadequate staff trained in handling computers and software packages, administrative concerns, etc. In Kerala, automation has been initiated in almost all university libraries using library automation software and is at different stages of completion. Not many studies have been conducted on the effects of information and communication technologies on the professional activities of library professionals in the universities of Kerala. It is important to evaluate whether progress in ICT has had any impact on the library profession in these highest educational institutions. The aim of the study is to assess whether developments in information and communication technologies have any influence on library professionals' professional development and the need for further education and training in the profession, and to evaluate their skills in handling developments in ICT. The total population of the study is 252, comprising the permanently employed professional library staff in the central libraries and departmental libraries on the main campuses of the universities under study. This is almost a census study of the defined population. The questionnaire method was adopted for collection of data, supplemented by interviews with librarians to gather additional information. Library professionals have a positive approach towards ICT applications and services in libraries, but the majority do not have opportunities to develop their skills and competencies in their work environment.
To develop competitive personnel in a technologically advanced world, high priority must be given by university administrators and library associations to developing competence in ICT applications, library management and soft skills among library professionals. Library science schools and teaching departments across the country have to take significant steps to revise the library science curriculum and incorporate significant changes to meet the demands and challenges of the library science profession.
Abstract:
Non-destructive testing (NDT) is the use of non-invasive techniques to determine the integrity of a material, component, or structure. Engineers and scientists use NDT in a variety of applications, including medical imaging, materials analysis, and process control. The photothermal beam deflection technique is one of the most promising NDT technologies, and tremendous R&D effort has been made to improve its efficiency and simplicity. It is a popular technique because it can probe surfaces irrespective of the size of the sample and its surroundings. The technique has been used to characterize several semiconductor materials because of its non-destructive and non-contact evaluation strategy, and its application further extends to the analysis of a wide variety of materials. Instrumentation of an NDT technique is crucial for any material analysis. Chapter two explores the various excitation sources, source modulation techniques, and detection and signal processing schemes currently practised. The features of the experimental arrangement, including the steps for alignment, automation, data acquisition and data analysis, are explained in detail. Theoretical studies form the backbone of photothermal techniques, and the outcome of a theoretical work is the foundation of an application. The reliability of the theoretical model developed and used is proven from the studies done on crystalline samples. The technique is applied to the analysis of transport properties such as thermal diffusivity, mobility, surface recombination velocity and minority carrier lifetime, and to thermal imaging of solar cell absorber layer materials like CuInS2, CuInSe2 and SnS thin films, as well as to the analysis of In2S3 thin films, which are used as buffer layer material in solar cells.
The various influences of film composition and of chlorine and silver incorporation in this material are brought out from the measurement of transport properties and the analysis of sub-band-gap levels. The application of the photothermal deflection technique for characterization of solar cells is a relatively new area that requires considerable attention. Chapter six thus elucidates the theoretical aspects of applying photothermal techniques to solar cell analysis. The experimental design and method for determination of solar cell efficiency, optimum load resistance and series resistance, with results from the analysis of a CuInS2/In2S3 based solar cell, form the skeleton of this chapter.
Abstract:
In this work we present the results of our attempt to build a compact photothermal spectrometer capable of both manual and automated modes of operation. The salient features of the system include the ability to analyse thin film, powder and polymer samples. The tool has been in use to investigate thermal, optical and transport properties. Binary and ternary semiconducting thin films were analysed for their thermal diffusivities. The system could perform thickness measurements non-destructively. Ion-implanted semiconductors are widely studied for the effect of radiation-induced defects, and we could perform non-destructive imaging of such defects using our spectrometer. The results reported in this thesis on the above, in addition to studies on In2S3 and the transparent conducting oxide ZnO, have been achieved with this spectrometer. Various polymer samples have been easily analysed for their thermal diffusivities; the technique provided an ease of analysis not achieved with conventional techniques like TGA and DSC. Industrial application of the tool has also been proved by analysing defects in welded joints and the adhesion of paints. Indigenization of the expensive lock-in amplifier and automation have been the significant achievements in the course of this dissertation. We are on our way to proving the noise rejection capabilities of our PC-based lock-in amplifier.
Abstract:
Hevea latex is a natural biological liquid of very complex composition. Besides rubber hydrocarbons, it contains many proteinous and resinous substances, carbohydrates, inorganic matter, water, and others. The dry rubber content (DRC) of latex varies according to season, tapping system, weather, soil conditions, clone, age of the tree, etc. The true DRC of the latex must be determined to ensure fair prices for the latex during commercial exchange. The DRC of Hevea latex is a very familiar term to all in the rubber industry. It has been the basis for incentive payments to tappers who bring in more than the daily agreed poundage of latex, and it is an important parameter for rubber and latex processing industries for automation and various decision-making processes. This thesis embodies my efforts to determine the DRC of rubber latex using different analytical tools such as MIR absorption, thermal analysis, dielectric spectroscopy and NIR reflectance. The rubber industry is still looking for a compact instrument that is accurate, economical, easy to use and environment-friendly. I hope the results presented in this thesis will help to realise this goal in the near future.
Abstract:
Gap analysis is a very useful tool for firms contemplating adoption of a new system. This paper envisages the use of the gap analysis tool as a precursor for Indian seafood exporting firms to adopt total quality management. Total quality management (TQM) is a management philosophy which strives to put quality at the forefront of all its decision-making, thereby satisfying customers. We therefore recommend that firms adopt the TQM system for better alignment of management goals. The gap analysis tool serves as a roadmap for TQM implementation, by showing the management where they actually are and where they want to be. The main gaps were found to be in the areas of usage of SPC tools (66.7%), benchmarking (65.6%), top management commitment (56.25%) and customer focus (48.1%).
Abstract:
The telemetry data processing operations intended for a given mission are pre-defined by the onboard telemetry configuration; the mission trajectory and overall telemetry methodology have stabilized lately for ISRO vehicles. The given problem of telemetry data processing is reduced through hierarchical problem reduction, whereby the sequencing of operations evolves as the control task and the operations on data as the function tasks. Each function task's input, output and execution criteria are captured in tables, which are examined by the control task; the control task then schedules a function task when its criteria are met.
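The table-driven control/function-task split described above can be illustrated with a minimal sketch. All names, data layouts and the example criterion below are hypothetical illustrations, not the actual ISRO software:

```python
# Hypothetical sketch of a table-driven control task: function tasks and
# their execution criteria live in a table; the control task walks the
# table and schedules each operation only when its criterion is met.

def scale(frame):
    # Example function task: convert raw counts to engineering units.
    return [x * 0.5 for x in frame]

def checksum_ok(frame):
    # Example execution criterion: accept only frames with an even checksum.
    return sum(frame) % 2 == 0

# Function-task table: (criterion, operation) pairs examined by the control task.
task_table = [(checksum_ok, scale)]

def control_task(frames):
    """Sequence operations: run each function task whose criterion holds."""
    results = []
    for frame in frames:
        for criterion, operation in task_table:
            if criterion(frame):
                results.append(operation(frame))
    return results

print(control_task([[2, 4], [1, 2], [3, 5]]))  # → [[1.0, 2.0], [1.5, 2.5]]
```

The point of the table is that new processing operations can be added by extending `task_table` rather than editing the control loop.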
Abstract:
Information is knowledge, facts or data. To enable users to assimilate it, information should be repackaged. Knowledge becomes information when it is externalized, i.e. put into the process of communication. The effectiveness of communication technology depends on how well it provides its clients with information rapidly, economically and authentically. A large number of ICT-enabled services, including the OPAC, e-resources, etc., are available in the university library. Studies have been done to find the impact of ICT on different sections of the CUSAT library by observing the activities of the different sections, holding discussions with colleagues and visitors, and analysing the entries in the library records. The results of the studies are presented here in the form of a paper.
Abstract:
The Central Library of Cochin University of Science and Technology (CUSAT) had been automated with proprietary software (Adlib Library) since 2000. After 11 years, in 2011, the university authorities decided to shift to an open source software (OSS) integrated library management system (ILMS), Koha, for automating the library housekeeping operations. In this context, this study attempts to share the experiences of cataloguing with both types of software. The features of the cataloguing modules of both software packages are analysed on the basis of certain checkpoints. It is found that the cataloguing module of Koha is almost on par with that of the proven proprietary software that has been in the market for the past 25 years. Some suggestions made by this study may be incorporated for the further development and perfection of Koha.
Abstract:
The paper discusses the use of online information resources for organising knowledge in the library and information centres of Cochin University of Science and Technology (CUSAT), along with the status and extent of automation in the CUSAT library. The use of different online resources, and the purposes for which these resources are being used, is explained in detail. The structured interview method was applied for collecting data. It was observed that 67 per cent of users consult online resources to assist knowledge organisation. The Library of Congress catalogue is the most widely used (100 per cent) online resource, followed by the OPAC of CUSAT and the catalogue of the British Library. The main purposes for using these resources are class number building and subject indexing.
Abstract:
This paper introduces a simple and efficient method, and its implementation in an FPGA, for reducing the odometric localization errors caused by over-count readings of an optical encoder based odometric system in a mobile robot due to wheel slippage and terrain irregularities. The detection and correction are based on redundant encoder measurements. The method suggested relies on the fact that wheel slippage or terrain irregularities cause more count readings from the encoder than correspond to the actual distance travelled by the vehicle. The standard quadrature technique is used to obtain four counts in each encoder period. In this work a three-wheeled mobile robot with one driving-steering wheel and two fixed rear wheels in-axis, fitted with incremental optical encoders, is considered. The CORDIC algorithm has been used for the computation of the sine and cosine terms in the update equations. The results presented demonstrate the effectiveness of the technique.
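The CORDIC computation of the sine and cosine terms mentioned above can be sketched in rotation mode as follows. This is a floating-point illustration only; the paper's FPGA design would use fixed-point shift-and-add arithmetic, and the function name and iteration count here are this sketch's own choices:

```python
import math

def cordic_sin_cos(theta, iterations=32):
    """Compute (sin, cos) of theta (|theta| <= pi/2) by CORDIC in rotation
    mode: rotate a pre-scaled unit vector through angles atan(2**-i) until
    the residual angle z is driven to zero."""
    # Precomputed micro-rotation angles and the aggregate CORDIC gain 1/K.
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    k = 1.0
    for i in range(iterations):
        k *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = k, 0.0, theta  # start pre-scaled so no final multiply is needed
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0          # rotate toward the residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return y, x  # y converges to sin(theta), x to cos(theta)
```

Because each micro-rotation uses only shifts by `2**-i` and additions, the loop maps directly onto FPGA logic without multipliers.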
Abstract:
A new localization approach to increase the navigational capabilities and object manipulation of autonomous mobile robots, based on an encoded infrared sheet-of-light beacon system which provides position errors smaller than 0.02 m, is presented in this paper. To achieve this minimal position error, a resolution enhancement technique has been developed utilising inbuilt odometric/optical flow sensor information. The system respects strong low-cost constraints by using an innovative assembly for the digitally encoded infrared transmitter. For better guidance of mobile robot vehicles, an online traffic signalling capability is also incorporated. Other added features are its low computational complexity and online localization capability, all without any estimation uncertainty. The constructional details, experimental results and computational methodologies of the system are also described.
Abstract:
The study of variable stars is an important topic of modern astrophysics. After the invention of powerful telescopes and CCDs of high resolving power, variable star data are accumulating on the order of petabytes. This huge amount of data needs many automated methods as well as human experts. This thesis is devoted to the analysis of variable stars' astronomical time series data and hence belongs to the inter-disciplinary topic of Astrostatistics. For an observer on Earth, stars that change in apparent brightness over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various reasons. In some cases, the variation is due to internal thermo-nuclear processes; these stars are generally known as intrinsic variables. In other cases, it is due to external processes, like eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherical stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena; most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as its light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as the phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star.
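The phase-folding construction described above can be sketched in a few lines. The function name and the toy light curve are this sketch's own assumptions:

```python
import numpy as np

def phase_fold(times, magnitudes, period):
    """Fold a light curve on a trial period: phase is the fractional part
    of t/P, so observations one period apart land on the same phase.
    Returns the points sorted by phase, ready to plot as a phased light curve."""
    phases = np.mod(times, period) / period
    order = np.argsort(phases)
    return phases[order], magnitudes[order]

# Toy example: a strictly periodic signal (period 2.5) folds onto one
# smooth curve in phase, even though it spans several cycles in time.
t = np.linspace(0.0, 10.0, 200)
mag = 12.0 + 0.3 * np.sin(2.0 * np.pi * t / 2.5)
phase, m = phase_fold(t, mag, 2.5)
```

Folding with a wrong trial period scatters the points across the phase axis instead of tracing a single curve, which is what the period search methods below exploit.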
One way to identify the type of a variable star and to classify it is for an expert to visually inspect the phased light curve. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages: observation, data reduction, data analysis, modeling and classification. The modeling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties like mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters like period, amplitude and phase, as well as other derived parameters. Of these, period is the most important, since a wrong period can lead to sparse light curves and misleading information. Time series analysis is a method of applying mathematical and statistical tests to data to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps. This is due to daily varying daylight and weather conditions for ground-based observations, while observations from space may suffer from the impact of cosmic ray particles. Many large-scale astronomical surveys such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS provide variable star time series data, even though their primary intention is not variable star observation.
The Center for Astrostatistics, Pennsylvania State University, was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. There exist many period search algorithms for astronomical time series analysis, which can be classified into parametric (assuming some underlying distribution for the data) and non-parametric (assuming no statistical model, Gaussian or otherwise) methods. Many of the parametric methods are based on variations of discrete Fourier transforms, like the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be brought under automation, none of the methods stated above can fully recover the true periods. Wrong detection of the period can be due to several reasons, such as power leakage to other frequencies, which arises from the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem in the case of huge databases subjected to automation. As Matthew Templeton, AAVSO, states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state, "The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification".
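The non-parametric idea behind Phase Dispersion Minimisation can be sketched roughly as follows. This is a simplified reading of Stellingwerf's statistic; the bin handling, function names and toy data are this sketch's own choices, not the thesis implementation:

```python
import numpy as np

def pdm_theta(times, mags, period, n_bins=10):
    """Simplified PDM statistic: fold the data on a trial period, bin by
    phase, and compare the pooled within-bin variance to the overall
    variance. The statistic is minimised near the true period, because a
    correct fold makes each phase bin nearly constant."""
    phases = np.mod(times, period) / period
    bins = np.minimum((phases * n_bins).astype(int), n_bins - 1)
    total_var = np.var(mags, ddof=1)
    num, den = 0.0, 0
    for b in range(n_bins):
        sel = mags[bins == b]
        if len(sel) > 1:
            num += np.var(sel, ddof=1) * (len(sel) - 1)
            den += len(sel) - 1
    return (num / den) / total_var

# Toy light curve with true period 2.5: the statistic is far smaller at the
# true period than at an arbitrary wrong trial period.
t = np.linspace(0.0, 30.0, 600)
mag = 12.0 + 0.3 * np.sin(2.0 * np.pi * t / 2.5)
theta_true = pdm_theta(t, mag, 2.5)
theta_wrong = pdm_theta(t, mag, 1.7)
```

In practice the statistic is evaluated over a grid of trial periods and the minima are inspected; no distributional assumption on the magnitudes is needed, which is what makes the method non-parametric.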
It will be beneficial to the variable star astronomical community if basic parameters such as period, amplitude and phase are obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the "General Catalogue of Variable Stars" or other databases like the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.