898 results for desig automation of robots
Abstract:
The shift from print to digital information has a high impact on all components of the academic library system in India, especially the users, services and the staff. Though information is considered an important resource, the use of ICT tools to collect and disseminate information has proceeded at a slow pace in the majority of university libraries. This may be due to various factors such as insufficient funds, inadequate staff trained in handling computers and software packages, administrative concerns, etc. In Kerala, automation has been initiated in almost all university libraries using library automation software and is at different stages of completion. Not many studies have been conducted on the effects of information communication technologies on the professional activities of library professionals in the universities in Kerala. It is important to evaluate whether progress in ICT has had any impact on the library profession in these institutions of higher education. The aim of the study is to assess whether developments in information communication technologies have influenced library professionals' professional development and their need for further education and training, and to evaluate their skills in handling developments in ICT. The total population of the study is 252, comprising the permanently employed professional library staff in the central libraries and departmental libraries on the main campuses of the universities under study; this is almost a census study of the defined population. The questionnaire method was adopted for data collection, supplemented by interviews with librarians to gather additional information. Library professionals have a positive approach towards ICT applications and services in libraries, but the majority do not have opportunities to develop their skills and competencies in their work environment. To develop competitive personnel in a technologically advanced world, university administrators and library associations must give high priority to developing competence in ICT applications, library management and soft skills among library professionals. Library science schools and teaching departments across the country have to take significant steps to revise the library science curriculum and incorporate significant changes to meet the demands and challenges of the library science profession.
Abstract:
Non-destructive testing (NDT) is the use of non-invasive techniques to determine the integrity of a material, component, or structure. Engineers and scientists use NDT in a variety of applications, including medical imaging, materials analysis, and process control. The photothermal beam deflection technique is one of the most promising NDT technologies, and tremendous R&D effort has been made to improve its efficiency and simplicity. It is a popular technique because it can probe surfaces irrespective of the size of the sample and its surroundings. Because of its non-destructive and non-contact evaluation strategy, it has been used to characterize several semiconductor materials, and its application extends to the analysis of a wide variety of materials. Instrumentation of an NDT technique is crucial for any material analysis. Chapter two explores the various excitation sources, source modulation techniques, and detection and signal processing schemes currently practised. The features of the experimental arrangement, including the steps for alignment, automation, data acquisition and data analysis, are explained giving due importance to details. Theoretical studies form the backbone of photothermal techniques, and the outcome of a theoretical work is the foundation of an application. The reliability of the theoretical model developed and used is proven from studies on crystalline samples. The technique is applied to the analysis of transport properties such as thermal diffusivity, mobility, surface recombination velocity and minority carrier lifetime, and to thermal imaging of solar cell absorber layer materials like CuInS2, CuInSe2 and SnS thin films. It is also applied to the analysis of In2S3 thin films, which are used as a buffer layer material in solar cells; the influences of film composition and of chlorine and silver incorporation in this material are brought out from the measurement of transport properties and the analysis of sub-band-gap levels. The application of the photothermal deflection technique to the characterization of solar cells is a relatively new area that requires considerable attention. Chapter six thus elucidates the theoretical aspects of applying photothermal techniques to solar cell analysis. The experimental design and method for the determination of solar cell efficiency, optimum load resistance and series resistance, with results from the analysis of a CuInS2/In2S3 based solar cell, form the skeleton of this chapter.
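As a concrete illustration of the kind of transport measurement described above, the short sketch below assumes the common phase method in which the photothermal signal phase falls linearly with the pump-probe offset x with slope 1/mu, where mu = sqrt(alpha / (pi f)) is the thermal diffusion length; the function name, frequency and data are hypothetical and not taken from the thesis.

import numpy as np

# Illustrative sketch (not the thesis code): in many photothermal phase methods the
# signal phase falls off linearly with pump-probe offset x, with slope 1/mu, where
# mu = sqrt(alpha / (pi * f)) is the thermal diffusion length at modulation frequency f.
def thermal_diffusivity(offset_m, phase_rad, mod_freq_hz):
    """Estimate thermal diffusivity (m^2/s) from a linear phase-vs-offset scan."""
    slope, _ = np.polyfit(offset_m, phase_rad, 1)   # phase ~ -x / mu + const
    mu = 1.0 / abs(slope)                           # thermal diffusion length (m)
    return np.pi * mod_freq_hz * mu**2              # alpha = pi * f * mu^2

# Example with synthetic data: mu = 50 um at f = 10 Hz -> alpha ~ 7.9e-8 m^2/s
x = np.linspace(0, 200e-6, 20)
phase = -x / 50e-6 + 0.1
print(thermal_diffusivity(x, phase, 10.0))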
Abstract:
In this work we present the results of our attempt to build a compact photothermal spectrometer capable of both manual and automated modes of operation. The salient features of the system include the ability to analyse thin film, powder and polymer samples. The tool has been in use to investigate thermal, optical and transport properties. Binary and ternary semiconducting thin films were analysed for their thermal diffusivities, and the system could perform thickness measurements non-destructively. Ion-implanted semiconductors are widely studied for the effect of radiation-induced defects, and we could perform non-destructive imaging of defects using our spectrometer. The results reported in this thesis on the above, in addition to studies on In2S3 and the transparent conducting oxide ZnO, have been achieved with this spectrometer. Various polymer samples have been easily analysed for their thermal diffusivities; the technique provided an ease of analysis not achieved with conventional techniques like TGA and DSC. Industrial application of the tool has also been proved by analysing defects in welded joints and the adhesion of paints. Indigenization of the expensive lock-in amplifier and automation have been significant achievements in the course of this dissertation. We are on our way to proving the noise rejection capabilities of our PC LIA.
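Since the abstract highlights a PC-based lock-in amplifier (PC LIA), the sketch below shows, under general assumptions and not as the instrument actually built here, how a digital lock-in recovers a small modulated signal: multiply by in-phase and quadrature references at the chopping frequency and low-pass filter the products (here, a simple mean over many cycles).

import numpy as np

# Minimal digital lock-in sketch (a generic assumption about how a PC-based LIA works,
# not the instrument built in this work).
def lock_in(signal, fs, f_ref):
    t = np.arange(len(signal)) / fs
    x = np.mean(signal * np.cos(2 * np.pi * f_ref * t))   # in-phase component
    y = np.mean(signal * np.sin(2 * np.pi * f_ref * t))   # quadrature component
    amplitude = 2.0 * np.hypot(x, y)                       # recovered signal amplitude
    phase = np.degrees(np.arctan2(y, x))                   # phase relative to reference
    return amplitude, phase

# Example: a 0.05 V signal at 137 Hz buried in noise 20x larger
fs, f_ref = 50_000.0, 137.0
t = np.arange(int(fs)) / fs
rng = np.random.default_rng(1)
noisy = 0.05 * np.cos(2 * np.pi * f_ref * t + 0.5) + 1.0 * rng.normal(size=t.size)
print(lock_in(noisy, fs, f_ref))   # roughly recovers the 0.05 amplitude and -29 deg phase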
Abstract:
Accurate sensing of vehicle position and attitude is still a very challenging problem in many mobile robot applications: a mobile robot must have some means of estimating where it is and in which direction it is heading. Many existing indoor positioning systems are limited in workspace and robustness because they require clear lines-of-sight or do not provide absolute, drift-free measurements. The research work presented in this dissertation provides a new approach to a position and attitude sensing system designed specifically to meet the challenges of operation in a realistic, cluttered indoor environment, such as an office building, hospital, industrial plant or warehouse. This is accomplished by an innovative assembly of infrared LED sources that restricts the light intensity distribution to a sheet of light encoded with localization and traffic information. This Digital Infrared Sheet of Light Beacon (DISLiB) developed for mobile robots is a high-resolution absolute localization system which is simple, fast, accurate and robust, without much computational burden or significant processing. The performance of most available beacons in corridors and narrow passages is not satisfactory, whereas the performance of DISLiB is very encouraging in such situations; this research overcomes most of the inherent limitations of existing systems. The work further examines the odometric localization errors caused by over-count readings of an optical-encoder-based odometric system in a mobile robot due to wheel slippage and terrain irregularities. A simple and efficient method for reducing these errors is investigated and realized using an FPGA. The detection and correction are based on redundant encoder measurements; the method relies on the fact that wheel slippage or terrain irregularities cause more count readings from the encoder than correspond to the actual distance travelled by the vehicle. The application of the encoded Digital Infrared Sheet of Light Beacon (DISLiB) system can be extended to intelligent control of public transportation systems. The system is capable of receiving traffic status input through a GSM (Global System for Mobile Communications) modem. The vehicles have infrared receivers and processors capable of decoding the information and generating audio and video messages to assist the driver. The thesis further examines the usefulness of the technique to assist the movement of differently-abled (blind) persons within the indoor or outdoor premises of their residence. The work addressed in this thesis suggests a new way forward in the development of autonomous robotics and guidance systems; however, it can easily be extended to many other challenging domains as well.
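To make the over-count idea concrete, here is a minimal sketch (hypothetical threshold and function names, not the FPGA design described in the dissertation) of how redundant encoder increments can be fused, treating the smaller reading as the safer estimate when the two disagree.

# Illustrative sketch of the idea described above (not the FPGA implementation):
# wheel slippage or terrain irregularities make an encoder register MORE counts than
# the distance actually travelled, so with redundant encoders the smaller increment
# is the safer estimate and a large disagreement flags a slip event.
SLIP_THRESHOLD = 5          # counts per sampling interval (hypothetical tuning value)

def fuse_encoder_increments(delta_a, delta_b):
    """Return (corrected increment, slip detected?) for one sampling interval."""
    slip = abs(delta_a - delta_b) > SLIP_THRESHOLD
    corrected = min(delta_a, delta_b) if slip else (delta_a + delta_b) / 2.0
    return corrected, slip

# Example: encoder A over-counts because its wheel spun on a slick patch
print(fuse_encoder_increments(delta_a=42, delta_b=30))   # -> (30, True)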
Abstract:
Hevea latex is a natural biological liquid of very complex composition. Besides rubber hydrocarbons, it contains many proteinous and resinous substances, carbohydrates, inorganic matter, water and others. The Dry Rubber Content (DRC) of latex varies according to season, tapping system, weather, soil conditions, clone, age of the tree, etc. The true DRC of the latex must be determined to ensure fair prices for the latex during commercial exchange. The DRC of Hevea latex is a very familiar term to all in the rubber industry. It has been the basis for incentive payments to tappers who bring in more than the daily agreed poundage of latex, and it is an important parameter for rubber and latex processing industries for automation and various decision-making processes. This thesis embodies my efforts to determine the DRC of rubber latex using different analytical tools such as MIR absorption, thermal analysis, dielectric spectroscopy and NIR reflectance. The rubber industry is still looking for a compact instrument that is accurate, economical, easy to use and environment friendly. I hope the results presented in this thesis will help to realise this goal in the near future.
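A typical instrument of the kind sought here rests on a calibration between a measured signal and laboratory DRC values. The sketch below is a generic least-squares calibration against NIR reflectance at a few bands; the band count, numbers and function names are purely illustrative assumptions, not results from the thesis.

import numpy as np

# Hedged sketch of the usual calibration step behind such instruments (not the
# thesis's actual model): regress laboratory DRC values against NIR reflectance
# measured at a few bands, then predict DRC for fresh latex samples.
R_train = np.array([[0.52, 0.61, 0.48],          # reflectance at 3 NIR bands, sample 1
                    [0.55, 0.58, 0.47],
                    [0.60, 0.54, 0.45],
                    [0.66, 0.50, 0.43]])
drc_train = np.array([28.0, 31.5, 35.2, 39.8])   # lab DRC (%) by standard oven drying

X = np.hstack([R_train, np.ones((len(R_train), 1))])   # add intercept column
coef, *_ = np.linalg.lstsq(X, drc_train, rcond=None)   # least-squares calibration

def predict_drc(reflectance):
    return np.dot(np.append(reflectance, 1.0), coef)

print(predict_drc([0.58, 0.56, 0.46]))                 # predicted DRC (%) for a new sample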
Abstract:
Information is knowledge, facts or data. To enable users to assimilate information, it should be repackaged. Knowledge becomes information when it is externalized, i.e. put into the process of communication. The effectiveness of communication technology depends on how well it provides its clients with information rapidly, economically and authentically. A large number of ICT-enabled services, including the OPAC and e-resources, are available in the university library. Studies have been done to find the impact of ICT on different sections of the CUSAT library by observing the activities of different sections, holding discussions with colleagues and visitors, and analysing the entries in the library records. The results of the studies are presented here in the form of a paper.
Abstract:
The Central Library of Cochin University of Science and Technology (CUSAT) had been automated using proprietary software (Adlib Library) since 2000. After 11 years, in 2011, the university authorities decided to shift to an open source integrated library management system (ILMS), Koha, for automating the library housekeeping operations. In this context, this study attempts to share the experiences in cataloging with both types of software. The features of the cataloging modules of both packages are analysed on the basis of certain check points. It is found that the cataloging module of Koha is almost on par with that of proven proprietary software that has been in the market for the past 25 years. Some suggestions made by this study may be incorporated for the further development and perfection of Koha.
Abstract:
The paper discusses the use of online information resources for organising knowledge in library and information centres at Cochin University of Science and Technology (CUSAT). It describes the status and extent of automation in the CUSAT library and explains in detail the different online resources in use and the purposes for which they are used. The structured interview method was applied for collecting data. It was observed that 67 per cent of users consult online resources to assist knowledge organisation. The Library of Congress catalogue is the most widely used (100 per cent) online resource, followed by the OPAC of CUSAT and the catalogue of the British Library. The main purposes for using these resources are class number building and subject indexing.
Abstract:
The study of variable stars is an important topic of modern astrophysics. After the invention of powerful telescopes and high-resolving-power CCDs, variable star data are accumulating on the order of petabytes. The huge amount of data calls for many automated methods as well as human experts. This thesis is devoted to data analysis of variable stars' astronomical time series data and hence belongs to the interdisciplinary topic, Astrostatistics. For an observer on earth, stars that show a change in apparent brightness over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various reasons. In some cases the variation is due to internal thermo-nuclear processes, and such stars are generally known as intrinsic variables; in other cases it is due to external processes, like eclipse or rotation, and these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherical stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena. Most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data, which contain time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as the light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as the phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star, and one way to identify the type of a variable star and classify it is for an expert to visually inspect the phased light curve. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages such as observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (for example, the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties like mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of the basic parameters like period, amplitude and phase, and also some other derived parameters. Of these, period is the most important, since wrong periods can lead to sparse light curves and misleading information. Time series analysis is a method of applying mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behaviour. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps. This is due to daily varying daylight and weather conditions for ground-based observations, while observations from space may suffer from the impact of cosmic ray particles.
Many large-scale astronomical surveys such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS provide variable star time series data, even though their primary intention is not variable star observation. The Center for Astrostatistics, Pennsylvania State University, was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. There exist many period search algorithms for astronomical time series analysis, which can be classified into parametric (assuming some underlying distribution for the data) and non-parametric (not assuming any statistical model, such as Gaussian) methods. Many of the parametric methods are based on variations of the discrete Fourier transform, like the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of the methods can be brought under automation, none of the methods stated above can fully recover the true periods. Wrong detection of a period can be due to several reasons, such as power leakage to other frequencies caused by the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem in the case of huge databases subjected to automation. As Matthew Templeton, AAVSO, states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state, "The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification". It will be beneficial for the variable star astronomical community if basic parameters, such as period, amplitude and phase, are obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and entering them in the "General Catalogue of Variable Stars" or other databases like the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
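Of the methods named above, Phase Dispersion Minimisation (Stellingwerf 1978) is easy to state compactly. The sketch below is a minimal, generic implementation, not the thesis's code: fold the light curve on each trial period, bin the phases, and choose the period that minimises the pooled within-bin variance relative to the overall variance.

import numpy as np

# Minimal sketch of Phase Dispersion Minimisation (PDM), one of the period-search
# methods discussed above; the synthetic light curve below is purely illustrative.
def pdm_theta(time, mag, period, n_bins=10):
    phase = (time / period) % 1.0
    total_var = np.var(mag, ddof=1)
    num, dof = 0.0, 0
    for b in range(n_bins):
        in_bin = mag[(phase >= b / n_bins) & (phase < (b + 1) / n_bins)]
        if in_bin.size > 1:
            num += (in_bin.size - 1) * np.var(in_bin, ddof=1)
            dof += in_bin.size - 1
    return (num / dof) / total_var if dof > 0 else np.inf

def pdm_best_period(time, mag, trial_periods):
    thetas = np.array([pdm_theta(time, mag, p) for p in trial_periods])
    return trial_periods[np.argmin(thetas)]

# Example: unevenly sampled sinusoid with true period 0.73 d
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 30, 400))
m = 12.0 + 0.3 * np.sin(2 * np.pi * t / 0.73) + 0.02 * rng.normal(size=t.size)
print(pdm_best_period(t, m, np.linspace(0.5, 1.0, 2000)))   # ~0.73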
Abstract:
The restarting automaton is a restricted model of computation that was introduced by Jancar et al. to model the so-called analysis by reduction, which is a technique used in linguistics to analyze sentences of natural languages. The most general models of restarting automata make use of auxiliary symbols in their rewrite operations, although this ability does not directly correspond to any aspect of the analysis by reduction. Here we put restrictions on the way in which restarting automata use auxiliary symbols, and we investigate the influence of these restrictions on their expressive power. In fact, we consider two types of restrictions. First, we consider the number of auxiliary symbols in the tape alphabet of a restarting automaton as a measure of its descriptional complexity. Secondly, we consider the number of occurrences of auxiliary symbols on the tape as a dynamic complexity measure. We establish some lower and upper bounds with respect to these complexity measures concerning the ability of restarting automata to recognize the (deterministic) context-free languages and some of their subclasses.
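For readers unfamiliar with analysis by reduction, the toy machine below sketches the idea for a restarting automaton that uses no auxiliary symbols at all (a hypothetical illustration, not a construction from the paper): in each cycle it finds a reducible factor "ab" in its window, deletes it, and restarts, accepting when the tape becomes empty. This simple machine recognises the deterministic context-free language of well-balanced words over {a, b}, with a as an opening and b as a closing bracket.

# Illustrative sketch of analysis by reduction with a toy restarting automaton that
# uses NO auxiliary symbols (not a construction from the paper): in each cycle the
# automaton deletes one occurrence of "ab" and restarts; it accepts on an empty tape.
def accepts(word: str) -> bool:
    tape = word
    while tape:
        i = tape.find("ab")              # the window detects a reducible factor
        if i < 0:
            return False                 # no rewrite applicable -> reject
        tape = tape[:i] + tape[i + 2:]   # delete-rewrite, then restart
    return True                          # empty tape -> accept

print(accepts("aaabbb"), accepts("aabbb"))   # True False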
Abstract:
Humans can effortlessly manipulate objects in their hands, dexterously sliding and twisting them within their grasp. Robots, however, have none of these capabilities; they simply grasp objects rigidly in their end effectors. To investigate this common form of human manipulation, an analysis of controlled slipping of a grasped object within a robot hand was performed. The Salisbury robot hand demonstrated many of these controlled slipping techniques, illustrating many results of this analysis. First, the possible slipping motions were found as a function of the location, orientation, and types of contact between the hand and object. Second, for a given grasp, the contact types were determined as a function of the grasping force and the external forces on the object. Finally, by changing the grasping force, the robot modified the constraints on the object and effected controlled slipping motions.
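The dependence of contact type on grasping force can be illustrated with the textbook Coulomb friction-cone test; the sketch below is a generic classifier with hypothetical names and numbers, not the contact analysis developed in this work.

import numpy as np

# Hedged, textbook-style sketch (not this work's formulation): under Coulomb friction,
# a point contact sticks while the tangential load stays inside the friction cone and
# begins to slide once it reaches the cone boundary.
def contact_mode(contact_force, normal, mu):
    """Classify a contact as 'sticking', 'sliding', or 'separating'."""
    n = np.asarray(normal, float) / np.linalg.norm(normal)
    f = np.asarray(contact_force, float)
    f_n = np.dot(f, n)                     # component pressing into the surface
    if f_n <= 0.0:
        return "separating"
    f_t = np.linalg.norm(f - f_n * n)      # tangential component
    return "sticking" if f_t < mu * f_n else "sliding"

# Example: raising the grasp (normal) force moves a contact from sliding to sticking
print(contact_mode([1.0, 0.0, 1.5], normal=[0, 0, 1], mu=0.5))   # sliding
print(contact_mode([1.0, 0.0, 3.0], normal=[0, 0, 1], mu=0.5))   # sticking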
Abstract:
The control of aerial gymnastic maneuvers is challenging because these maneuvers frequently involve complex rotational motion and because the performer has limited control of the maneuver during flight. A performer can influence a maneuver using a sequence of limb movements during flight. However, the same sequence may not produce reliable performances in the presence of off-nominal conditions. How do people compensate for variations in performance to reliably produce aerial maneuvers? In this report I explore the role that passive dynamic stability may play in making the performance of aerial maneuvers simple and reliable. I present a control strategy comprised of active and passive components for performing robot front somersaults in the laboratory. I show that passive dynamics can neutrally stabilize the layout somersault which involves an "inherently unstable" rotation about the intermediate principal axis. And I show that a strategy that uses open loop joint torques plus passive dynamics leads to more reliable 1 1/2 twisting front somersaults in simulation than a strategy that uses prescribed limb motion. Results are presented from laboratory experiments on gymnastic robots, from dynamic simulation of humans and robots, and from linear stability analyses of these systems.
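The "inherently unstable" rotation about the intermediate principal axis can be reproduced with a few lines of torque-free rigid-body simulation. The sketch below integrates Euler's equations for an illustrative inertia tensor (not a model of the gymnastic robots in this report) and compares a slightly perturbed spin about the intermediate axis with one about the major axis.

import numpy as np

I = np.array([1.0, 2.0, 3.0])            # principal moments of inertia (illustrative)

def euler_rates(w):
    # Torque-free Euler equations: I1*w1' = (I2 - I3)*w2*w3, and cyclic permutations.
    return np.array([(I[1] - I[2]) * w[1] * w[2] / I[0],
                     (I[2] - I[0]) * w[2] * w[0] / I[1],
                     (I[0] - I[1]) * w[0] * w[1] / I[2]])

def rk4_step(w, dt):
    # Classical 4th-order Runge-Kutta step.
    k1 = euler_rates(w)
    k2 = euler_rates(w + 0.5 * dt * k1)
    k3 = euler_rates(w + 0.5 * dt * k2)
    k4 = euler_rates(w + dt * k3)
    return w + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def max_tilt(w0, dt=1e-3, steps=20000):
    """Largest angle (deg) between omega(t) and omega(0) over the run."""
    w = np.array(w0, float)
    u0 = w / np.linalg.norm(w)
    worst = 0.0
    for _ in range(steps):
        w = rk4_step(w, dt)
        c = np.clip(np.dot(w, u0) / np.linalg.norm(w), -1.0, 1.0)
        worst = max(worst, np.degrees(np.arccos(c)))
    return worst

print(max_tilt([0.001, 5.0, 0.001]))   # spin about the intermediate axis flips: ~180 deg
print(max_tilt([0.001, 0.001, 5.0]))   # spin about the major axis: a fraction of a degree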
Abstract:
Freehand sketching is both a natural and crucial part of design, yet is unsupported by current design automation software. We are working to combine the flexibility and ease of use of paper and pencil with the processing power of a computer to produce a design environment that feels as natural as paper, yet is considerably smarter. One of the most basic steps in accomplishing this is converting the original digitized pen strokes in the sketch into the intended geometric objects using feature point detection and approximation. We demonstrate how multiple sources of information can be combined for feature detection in strokes and apply this technique using two approaches to signal processing, one using simple average based thresholding and a second using scale space.
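In the spirit of combining multiple information sources for feature point detection, the sketch below (a generic illustration with hypothetical thresholds, not the authors' system) marks candidate corners where the pen slows down and the stroke turns sharply, using simple average-based thresholds on speed and curvature.

import numpy as np

# Hedged sketch in the spirit of the approach described (not the authors' code):
# combine two information sources along a digitized stroke, turning pen speed and
# curvature into candidate feature (corner) points with average-based thresholds.
def feature_points(x, y, t):
    x, y, t = map(np.asarray, (x, y, t))
    dx, dy, dt = np.gradient(x), np.gradient(y), np.gradient(t)
    speed = np.hypot(dx, dy) / dt
    heading = np.unwrap(np.arctan2(dy, dx))
    curvature = np.abs(np.gradient(heading)) / (np.hypot(dx, dy) + 1e-9)
    slow = speed < 0.9 * speed.mean()            # evidence source 1: the pen slows down
    sharp = curvature > 1.1 * curvature.mean()   # evidence source 2: the stroke turns sharply
    return np.where(slow & sharp)[0]             # indices of candidate corners

# Example: an L-shaped stroke sampled at a constant rate -> corner indices near the bend
xs = np.concatenate([np.linspace(0, 1, 20), np.ones(20)])
ys = np.concatenate([np.zeros(20), np.linspace(0, 1, 20)])
ts = np.arange(40, dtype=float)
print(feature_points(xs, ys, ts))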
Abstract:
The transformation from high level task specification to low level motion control is a fundamental issue in sensorimotor control in animals and robots. This thesis develops a control scheme called virtual model control which addresses this issue. Virtual model control is a motion control language which uses simulations of imagined mechanical components to create forces, which are applied through joint torques, thereby creating the illusion that the components are connected to the robot. Due to the intuitive nature of this technique, designing a virtual model controller requires the same skills as designing the mechanism itself. A high level control system can be cascaded with the low level virtual model controller to modulate the parameters of the virtual mechanisms. Discrete commands from the high level controller would then result in fluid motion. An extension of Gardner's Partitioned Actuator Set Control method is developed. This method allows for the specification of constraints on the generalized forces which each serial path of a parallel mechanism can apply. Virtual model control has been applied to a bipedal walking robot. A simple algorithm utilizing a simple set of virtual components has successfully compelled the robot to walk eight consecutive steps.
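The core mapping behind a virtual model controller can be sketched briefly: a virtual spring-damper attached between a point on the robot and a desired anchor produces a Cartesian force, which is turned into joint torques through the Jacobian transpose, so the real actuators create the illusion of the imagined component. The two-link arm, gains and set-point below are illustrative assumptions, not a controller from the thesis.

import numpy as np

# Minimal sketch of the virtual-component-to-joint-torque mapping; the arm, gains and
# anchor point are illustrative, not taken from the bipedal walking controller.
def virtual_spring_damper(x, x_des, xdot, k, b):
    return k * (x_des - x) - b * xdot            # force from the virtual component (N)

def joint_torques(jacobian, force):
    return jacobian.T @ force                    # tau = J^T F

# Example: planar two-link arm with unit link lengths, joint angles q, joint rates qd
q = np.array([0.3, 0.6])
qd = np.array([0.0, 0.0])
J = np.array([[-np.sin(q[0]) - np.sin(q[0] + q[1]), -np.sin(q[0] + q[1])],
              [ np.cos(q[0]) + np.cos(q[0] + q[1]),  np.cos(q[0] + q[1])]])
x = np.array([np.cos(q[0]) + np.cos(q[0] + q[1]),
              np.sin(q[0]) + np.sin(q[0] + q[1])])   # end-effector position
F = virtual_spring_damper(x, x_des=np.array([1.2, 1.0]), xdot=J @ qd, k=200.0, b=20.0)
print(joint_torques(J, F))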
Abstract:
Since robots are typically designed with an individual actuator at each joint, the control of these systems is often difficult and non-intuitive. This thesis explains a more intuitive control scheme called Virtual Model Control. This thesis also demonstrates the simplicity and ease of this control method by using it to control a simulated walking hexapod. Virtual Model Control uses imagined mechanical components to create virtual forces, which are applied through the joint torques of real actuators. This method produces a straightforward means of controlling joint torques to produce a desired robot behavior. Due to the intuitive nature of this control scheme, the design of a virtual model controller is similar to the design of a controller with basic mechanical components. The ease of this control scheme facilitates the use of a high level control system which can be used above the low level virtual model controllers to modulate the parameters of the imaginary mechanical components. In order to apply Virtual Model Control to parallel mechanisms, a solution to the force distribution problem is required. This thesis uses an extension of Gardner's Partitioned Force Control method which allows for the specification of constrained degrees of freedom. This virtual model control technique was applied to a simulated hexapod robot. Although the hexapod is a highly non-linear, parallel mechanism, the virtual models allowed textbook control solutions to be used while the robot was walking. Using a simple linear control law, the robot walked while simultaneously balancing a pendulum and tracking an object.
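The force distribution problem mentioned above can be illustrated with a generic least-squares (pseudoinverse) resolution: the stance legs must jointly produce a desired force and moments on the body. The sketch below uses illustrative leg positions and is not Gardner's Partitioned Force Control method used in the thesis.

import numpy as np

# Generic sketch of the force distribution problem (not Gardner's method): columns of
# G map each stance leg's vertical ground-reaction force to the body's vertical force,
# roll moment and pitch moment; leg positions below are illustrative.
leg_y = np.array([ 0.3,  0.3,  0.3, -0.3, -0.3, -0.3])   # lateral offsets (m)
leg_x = np.array([ 0.4,  0.0, -0.4,  0.4,  0.0, -0.4])   # fore-aft offsets (m)
G = np.vstack([np.ones(6), leg_y, -leg_x])                # stance map (3 x 6)

desired = np.array([300.0, 0.0, 0.0])     # support 300 N with zero roll and pitch moment
leg_forces = np.linalg.pinv(G) @ desired  # minimum-norm distribution over the legs

print(np.round(leg_forces, 1))            # 50 N per leg for this symmetric stance
print(np.allclose(G @ leg_forces, desired))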