Abstract:
Gap analysis is a useful tool for firms contemplating the adoption of a new system. This paper proposes the gap analysis tool as a precursor for Indian seafood exporting firms adopting total quality management (TQM). TQM is a management philosophy that strives to put quality at the forefront of all decision-making, thereby satisfying customers. We therefore recommend that firms adopt TQM for better alignment of management goals. The gap analysis tool serves as a roadmap for TQM implementation by showing management where they actually are and where they want to be. The main gaps were found in the usage of SPC tools (66.7%), benchmarking (65.6%), top management commitment (56.25%) and customer focus (48.1%).
Abstract:
The telemetry data processing operations intended for a given mission are pre-defined by the onboard telemetry configuration; mission trajectory and overall telemetry methodology have stabilized in recent ISRO vehicles. The telemetry data processing problem is decomposed through hierarchical problem reduction, whereby the sequencing of operations evolves as the control task and the operations on data as the function tasks. Each function task's input, output and execution criteria are captured in tables, which the control task examines, scheduling a function task when its criteria are met.
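As a rough illustration of this table-driven scheme (a minimal sketch in Python; the task names, criteria and data layout are hypothetical, since the abstract defines no API):

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Sketch of the control-task / function-task split: each function task
# declares its inputs, outputs and an execution criterion in a table; the
# control task scans the table and dispatches a task only when its
# criterion is met.

@dataclass
class FunctionTask:
    name: str
    inputs: List[str]                    # store keys the task reads
    outputs: List[str]                   # store keys the task writes
    run: Callable[[dict], dict]          # the operation on data
    criterion: Callable[[dict], bool]    # execution criterion

def control_task(table: List[FunctionTask], store: Dict[str, object]) -> None:
    """Repeatedly scan the task table, scheduling any function task whose
    inputs are available and whose criterion holds, until none can run."""
    progressed = True
    while progressed:
        progressed = False
        for task in table:
            if all(k in store for k in task.inputs) and task.criterion(store):
                store.update(task.run({k: store[k] for k in task.inputs}))
                progressed = True

# Hypothetical two-stage pipeline: decommutate a raw frame, then convert
# raw counts to engineering units.
table = [
    FunctionTask("decommutate", ["raw_frame"], ["channels"],
                 run=lambda d: {"channels": list(d["raw_frame"])},
                 criterion=lambda s: "channels" not in s),
    FunctionTask("calibrate", ["channels"], ["engineering"],
                 run=lambda d: {"engineering": [2 * c for c in d["channels"]]},
                 criterion=lambda s: "engineering" not in s),
]
store = {"raw_frame": b"\x01\x02\x03"}
control_task(table, store)
print(store["engineering"])  # [2, 4, 6]
```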
Abstract:
Information is knowledge, facts or data. To enable users to assimilate it, information should be repackaged. Knowledge becomes information when it is externalized, i.e. put into the process of communication. The effectiveness of communication technology depends on how well it provides its clients with information rapidly, economically and authentically. A large number of ICT-enabled services, including the OPAC and e-resources, are available in the university library. Studies were conducted to find the impact of ICT on different sections of the CUSAT library by observing the activities of those sections, holding discussions with colleagues and visitors, and analyzing the entries in the library records. The results of these studies are presented in this paper.
Abstract:
The Central Library of Cochin University of Science and Technology (CUSAT) had been automated with proprietary software (Adlib Library) since 2000. After 11 years, in 2011, the university authorities decided to shift to an open source integrated library management system (ILMS), Koha, for automating the library's housekeeping operations. In this context, this study shares the experience of cataloging with both types of software. The features of the cataloging modules of both packages are analysed on the basis of certain check points. It is found that the cataloging module of Koha is almost on par with that of proven proprietary software that has been in the market for the past 25 years. Some suggestions made by this study may be incorporated for the further development and perfection of Koha.
Abstract:
The paper discusses the use of online information resources for organising knowledge in the library and information centres of Cochin University of Science and Technology (CUSAT), along with the status and extent of automation in the CUSAT library. The different online resources used, and the purposes for which they are used, are explained in detail. The structured interview method was applied for collecting data. It was observed that 67 per cent of users consult online resources to assist knowledge organisation. The Library of Congress catalogue is the most widely used (100 per cent) online resource, followed by the OPAC of CUSAT and the catalogue of the British Library. The main purposes for using these resources are class number building and subject indexing.
Abstract:
This paper introduces a simple and efficient method, and its implementation in an FPGA, for reducing the odometric localization errors caused by over-count readings of an optical encoder based odometric system in a mobile robot due to wheel slippage and terrain irregularities. The detection and correction are based on redundant encoder measurements. The method relies on the fact that wheel slippage or terrain irregularities cause more count readings from the encoder than correspond to the actual distance travelled by the vehicle. The standard quadrature technique is used to obtain four counts in each encoder period. In this work a three-wheeled mobile robot with one driving-steering wheel and two fixed, in-axis rear wheels, fitted with incremental optical encoders, is considered. The CORDIC algorithm has been used for the computation of the sine and cosine terms in the update equations. The results presented demonstrate the effectiveness of the technique.
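The update equations themselves are not reproduced in the abstract, but since CORDIC is named for their sine and cosine terms, a minimal rotation-mode CORDIC sketch may clarify the idea (shown in floating-point Python for readability; an FPGA implementation would use fixed-point shift-and-add steps):

```python
import math

def cordic_sin_cos(theta: float, iterations: int = 32):
    """Compute (sin(theta), cos(theta)) for |theta| <= pi/2 with the
    rotation-mode CORDIC iteration. In fixed point the multiplications
    by 2**-i become bit shifts, which is why CORDIC suits FPGAs."""
    atan_table = [math.atan(2.0 ** -i) for i in range(iterations)]
    # Scaling constant K = prod(cos(atan(2**-i))), precomputable offline.
    k = 1.0
    for i in range(iterations):
        k *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))

    x, y, z = 1.0, 0.0, theta
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0               # rotate towards z = 0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atan_table[i]
    return y * k, x * k                           # (sin, cos)

# A differential-drive pose update then only needs these two terms, e.g.
# x += d_center * cos(heading); y += d_center * sin(heading).
s, c = cordic_sin_cos(math.pi / 6)
print(round(s, 6), round(c, 6))  # ~0.5, ~0.866025
```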
Abstract:
A new localization approach to increase the navigational capabilities and object manipulation of autonomous mobile robots, based on an encoded infrared sheet-of-light beacon system, which provides position errors smaller than 0.02 m, is presented in this paper. To achieve this minimal position error, a resolution enhancement technique has been developed that utilises the inbuilt odometric/optical-flow sensor information. The system respects strong low-cost constraints by using an innovative assembly for the digitally encoded infrared transmitter. For better guidance of mobile robot vehicles, an online traffic signalling capability is also incorporated. Further features are its low computational complexity and online localization capability, all without any estimation uncertainty. The constructional details, experimental results and computational methodologies of the system are also described.
Abstract:
The study of variable stars is an important topic of modern astrophysics. Since the advent of powerful telescopes and high-resolution CCDs, variable star data has been accumulating on the order of petabytes. This huge amount of data calls for automated methods as well as human experts. This thesis is devoted to the analysis of astronomical time series data on variable stars and hence belongs to the interdisciplinary field of Astrostatistics.

For an observer on earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermonuclear processes; such stars are known as intrinsic variables. In other cases it is due to external processes, like an eclipse or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherical stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables, such as novae and supernovae, occur rarely and are not periodic phenomena; most other variations are periodic in nature.

Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as the light curve. If the time series is folded on a period, the plot of apparent magnitude against phase is known as the phased light curve. The unique shape of the phased light curve is characteristic of each type of variable star, so one way to identify the type of a variable star and classify it is for an expert to inspect the phased light curve visually. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers.

Research on variable stars can be divided into stages: observation, data reduction, data analysis, modeling and classification. Modeling helps to determine the short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties like mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters like period, amplitude and phase, as well as some derived parameters. Of these, the period is the most important, since a wrong period leads to sparse light curves and misleading information.

Time series analysis applies mathematical and statistical tests to data to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and possibly large gaps. For ground-based observations this is due to daylight and varying weather conditions, while observations from space may suffer from the impact of cosmic ray particles.

Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though variable star observation is not their primary intention. The Center for Astrostatistics, Pennsylvania State University, was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis.

There exist many period search algorithms for astronomical time series, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which assume no statistical model such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Wrong period detection can arise for several reasons: power leakage to other frequencies due to the finite total interval, finite sampling interval and finite amount of data; aliasing due to regular sampling; spurious periods due to long gaps; and power flow to harmonic frequencies, an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem when huge databases are subjected to automation. As Matthew Templeton, AAVSO, states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state, "The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification".

It would benefit the variable star astronomical community if basic parameters such as period, amplitude and phase could be obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, their strengths and weaknesses are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the "General Catalogue of Variable Stars" or other databases like the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
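As a toy illustration of the non-parametric family discussed above (this is not the thesis's modified cubic spline method, just a minimal Phase Dispersion Minimisation style scorer): fold the time series on each trial period and measure how tightly the folded magnitudes cluster within phase bins; the true period minimises the dispersion.

```python
import numpy as np

def pdm_theta(time, mag, trial_period, n_bins=10):
    """PDM-style statistic: pooled within-bin variance of the folded
    light curve divided by the overall variance. Small values mean the
    fold lines up, flagging a candidate period."""
    phase = (time / trial_period) % 1.0          # fold on the trial period
    overall_var = np.var(mag, ddof=1)
    num, dof = 0.0, 0
    for b in range(n_bins):
        in_bin = mag[(phase >= b / n_bins) & (phase < (b + 1) / n_bins)]
        if in_bin.size > 1:
            num += (in_bin.size - 1) * np.var(in_bin, ddof=1)
            dof += in_bin.size - 1
    return (num / dof) / overall_var if dof else np.inf

# Toy search: recover the period of a noisy, unevenly sampled sinusoid.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 400))        # uneven time instants
m = 12.0 + 0.3 * np.sin(2 * np.pi * t / 3.7) + rng.normal(0.0, 0.02, t.size)
periods = np.linspace(3.0, 4.5, 1500)
best = periods[np.argmin([pdm_theta(t, m, p) for p in periods])]
print(round(best, 3))                            # close to the true 3.7
```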
Abstract:
The restarting automaton is a restricted model of computation that was introduced by Jancar et al. to model the so-called analysis by reduction, which is a technique used in linguistics to analyze sentences of natural languages. The most general models of restarting automata make use of auxiliary symbols in their rewrite operations, although this ability does not directly correspond to any aspect of the analysis by reduction. Here we put restrictions on the way in which restarting automata use auxiliary symbols, and we investigate the influence of these restrictions on their expressive power. In fact, we consider two types of restrictions. First, we consider the number of auxiliary symbols in the tape alphabet of a restarting automaton as a measure of its descriptional complexity. Secondly, we consider the number of occurrences of auxiliary symbols on the tape as a dynamic complexity measure. We establish some lower and upper bounds with respect to these complexity measures concerning the ability of restarting automata to recognize the (deterministic) context-free languages and some of their subclasses.
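As a minimal sketch of analysis by reduction for readers unfamiliar with the model (the rules below are illustrative, not taken from the paper): a restarting automaton recognising the deterministic context-free language {a^n b^n} needs no auxiliary symbols at all. In each cycle it scans the tape, applies one length-reducing rewrite, and restarts; it accepts once the tape is empty.

```python
import re

def restarting_accepts(word: str) -> bool:
    """Analysis by reduction for {a^n b^n} without auxiliary symbols:
    scan the tape (check it still has the shape a*b*), delete one
    central factor 'ab', restart; accept on the empty tape."""
    tape = word
    while True:
        if not re.fullmatch(r"a*b*", tape):
            return False                   # scan fails: reject
        if tape == "":
            return True                    # fully reduced: accept
        i = tape.find("ab")
        if i < 0:
            return False                   # leftover a's or b's: reject
        tape = tape[:i] + tape[i + 2:]     # rewrite, then restart

print(restarting_accepts("aaabbb"), restarting_accepts("aabbb"))  # True False
```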
Abstract:
Conference proceedings - lectures from the Automation Symposium 2006
Abstract:
Lectures / presentations from the Automation Symposium 2008
Abstract:
Ultrafast laser pulses have become an integral part of the toolbox of countless laboratories doing physics, chemistry, and biological research. The work presented here is motivated by one strand of the ever-growing, interdisciplinary research towards understanding the fundamental workings of light-matter interactions. Specifically, attosecond pulses can be useful tools to obtain the desired insight; however, access to, and the utility of, such pulses depend on the generation of intense, few-cycle, carrier-envelope-phase stabilized laser pulses. The presented work can be thought of as a roadmap towards the latter. From the oscillator, which provides the broadband seed, to amplification methods, the pieces integral to the generation of attosecond pulses are discussed. A range of topics, from fundamentals to design challenges, is presented, paving the way towards the practical implementation of an intense, few-cycle, carrier-envelope-phase stabilized laser source.
Abstract:
Freehand sketching is both a natural and crucial part of design, yet it is unsupported by current design automation software. We are working to combine the flexibility and ease of use of paper and pencil with the processing power of a computer, to produce a design environment that feels as natural as paper yet is considerably smarter. One of the most basic steps is converting the original digitized pen strokes of a sketch into the intended geometric objects using feature point detection and approximation. We demonstrate how multiple sources of information can be combined for feature detection in strokes, and apply this technique using two approaches to signal processing: one using simple average-based thresholding and a second using scale space.
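As a hedged sketch of what combining information sources with average-based thresholding can look like (the exact features and thresholds of the paper are not reproduced here), one common pairing is pen speed and turning angle: corners are points where curvature is above average while the pen moves slower than average.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float, float]  # digitized pen sample: (x, y, t)

def feature_points(stroke: List[Point]) -> List[int]:
    """Mark sample i as a feature (corner) point when its turning angle
    exceeds the stroke's mean AND its pen speed falls below the mean;
    the endpoints are always feature points."""
    n = len(stroke)
    curvature = [0.0] * n
    speed = [0.0] * n
    for i in range(1, n - 1):
        (x0, y0, _), (x1, y1, t1), (x2, y2, t2) = stroke[i - 1:i + 2]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        curvature[i] = abs(math.atan2(math.sin(a2 - a1), math.cos(a2 - a1)))
        speed[i] = math.hypot(x2 - x1, y2 - y1) / ((t2 - t1) or 1e-9)
    mean_curv = sum(curvature) / n
    mean_speed = sum(speed) / n
    interior = [i for i in range(1, n - 1)
                if curvature[i] > mean_curv and speed[i] < mean_speed]
    return [0] + interior + [n - 1]

# An L-shaped stroke: the slow, sharp corner at index 3 is detected.
stroke = [(0, 0, 0.0), (1, 0, 0.1), (2, 0, 0.2), (3, 0, 0.35),
          (3, 1, 0.55), (3, 2, 0.65), (3, 3, 0.75)]
print(feature_points(stroke))  # [0, 3, 6]
```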
Abstract:
The Manufacturing Systems team was one of the research teams within the Lean Aerospace Initiative (LAI) whose goal was to document, analyze and communicate the design attributes and relationships that lead to significant performance improvements in manufacturing systems in the defense aerospace industry. This report will provide an integrated record of this research using the Production Operations Transition to Lean Roadmap as its organizing framework.
Abstract:
Implementation of a web application for the sports area of Blanes (Girona)