36 results for Language acquisition.

at Indian Institute of Science - Bangalore - India


Relevance:

30.00%

Publisher:

Abstract:

The broader goal of the research described here is to automatically acquire diagnostic knowledge from documents in the domain of manual and mechanical assembly of aircraft structures. These documents are treated as a discourse used by experts to communicate with others; it therefore becomes possible to use discourse analysis to enable machine understanding of the text. The research challenge addressed in the paper is to identify documents, or sections of documents, that are potential sources of knowledge. In a subsequent step, domain knowledge will be extracted from these segments. The segmentation task requires partitioning the document into relevant segments and understanding the context of each segment. In discourse analysis, the division of a discourse into segments is achieved through indicative clauses, called cue phrases, that signal changes in the discourse context. In formal documents, however, such language may not be used. Hence, the use of a domain-specific ontology and an assembly process model is proposed to segregate chunks of the text based on a local context. Elements of the ontology/model and their related terms serve as indicators of the current context of a segment and of changes in context between segments. Local contexts are aggregated over increasingly larger segments to identify whether the document (or portions of it) pertains to the topic of interest, namely assembly. Knowledge acquired through such processes enables the acquisition and reuse of knowledge during any part of the lifecycle of a product.
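
As a concrete illustration of the segmentation idea, the sketch below is a minimal, hypothetical Python rendering of it, not the authors' system: the tiny ontology is invented example data, each sentence is scored against ontology terms to determine its local context, and consecutive sentences sharing a context are grouped into one segment.

    # Minimal sketch (hypothetical data, not the paper's system): segment text
    # by tracking the local context indicated by domain-ontology terms.

    # Toy assembly ontology: context label -> indicative terms.
    ONTOLOGY = {
        "fastening":  {"rivet", "bolt", "torque", "fastener"},
        "alignment":  {"jig", "datum", "shim", "alignment"},
        "inspection": {"crack", "gauge", "tolerance", "inspect"},
    }

    def local_context(sentence):
        """Return the ontology context whose terms best match the sentence, if any."""
        words = {w.strip(".,;:").lower() for w in sentence.split()}
        scores = {ctx: len(words & terms) for ctx, terms in ONTOLOGY.items()}
        best = max(scores, key=scores.get)
        return best if scores[best] > 0 else None

    def segment(sentences):
        """Group consecutive sentences that share the same local context."""
        segments, current, current_ctx = [], [], None
        for s in sentences:
            ctx = local_context(s) or current_ctx   # inherit context if no indicator
            if current and ctx != current_ctx:
                segments.append((current_ctx, current))
                current = []
            current.append(s)
            current_ctx = ctx
        if current:
            segments.append((current_ctx, current))
        return segments

    doc = [
        "Install the rivet and check fastener torque.",
        "Verify the bolt seating.",
        "Use the jig to restore datum alignment.",
        "Inspect the hole for crack indications.",
    ]
    for ctx, sentences in segment(doc):
        print(ctx, "->", sentences)

Aggregating these per-segment contexts over larger spans then indicates whether a document, or part of it, pertains to assembly at all.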

Relevance:

20.00%

Publisher:

Abstract:

This paper describes the design and implementation of a high-level query language called Generalized Query-By-Rule (GQBR), which supports retrieval, insertion, deletion and update operations. This language, based on the formalism of database logic, enables users to access each database in a distributed heterogeneous environment without having to learn all the different data manipulation languages. The compiler has been implemented in Pascal on a DEC 1090 system.
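
The abstract does not give GQBR's syntax, so the Python sketch below is purely illustrative of the underlying idea rather than of the language itself: one uniform retrieval request is translated into the data manipulation dialect of whichever backend holds the relation. The catalog, dialect names and generated request strings are all assumptions.

    # Illustrative sketch (not GQBR itself): route one uniform query to the
    # data manipulation language of each heterogeneous backend.

    # Hypothetical mapping of relations to backend query dialects.
    CATALOG = {
        "employees": "relational_sql",
        "parts":     "navigational",
    }

    def translate(relation, predicate):
        """Produce a backend-specific retrieval request for a uniform query."""
        dialect = CATALOG[relation]
        if dialect == "relational_sql":
            return f"SELECT * FROM {relation} WHERE {predicate}"
        if dialect == "navigational":
            # Stand-in for a navigational DML; purely a placeholder string.
            return f"FIND {relation} RECORD WHERE {predicate}"
        raise ValueError(f"unknown dialect {dialect}")

    print(translate("employees", "dept = 'assembly'"))
    print(translate("parts", "weight > 10"))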

Relevance:

20.00%

Publisher:

Abstract:

Database management systems offer a very reliable and attractive data organization for fast and economical information storage and processing in diverse applications. It is even more important that the information be easily accessible, through a suitable data sublanguage, to users with varied backgrounds, professional as well as casual. The language described here, APPLE, is one such language for relational database systems; it is completely nonprocedural and well suited to users with minimal or no programming background. It is supported by an access path model which permits the user to formulate completely nonprocedural queries expressed solely in terms of attribute names. The data description language (DDL) and data manipulation language (DML) features of APPLE are also discussed. The underlying relational database has been implemented with the help of the DATATRIEVE-11 utility for record and domain definition, which is available on the PDP-11/35. The package is coded in Pascal and MACRO-11. Further, most of the limitations of the DATATRIEVE-11 utility have been eliminated in the interface package.
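
As a rough illustration of what an access path model buys the user, the Python sketch below (an assumption-laden toy, not APPLE) resolves a query stated only in attribute names: it locates the relations that hold those attributes and finds a join path linking them through shared attributes, so the user never specifies relations or joins.

    # Illustrative sketch of an access-path idea (not APPLE itself): given only
    # attribute names, locate the relations holding them and a join path
    # linking those relations through shared attributes.
    from collections import deque

    # Toy relational schema (assumed example data): relation -> attributes.
    SCHEMA = {
        "EMP":  {"emp_no", "emp_name", "dept_no"},
        "DEPT": {"dept_no", "dept_name", "loc_no"},
        "LOC":  {"loc_no", "city"},
    }

    def relations_for(attrs):
        """Pick one relation per requested attribute."""
        return {a: next(r for r, cols in SCHEMA.items() if a in cols) for a in attrs}

    def join_path(start, goal):
        """Breadth-first search over relations joined on shared attributes."""
        frontier, seen = deque([[start]]), {start}
        while frontier:
            path = frontier.popleft()
            if path[-1] == goal:
                return path
            for r in SCHEMA:
                if r not in seen and SCHEMA[path[-1]] & SCHEMA[r]:
                    seen.add(r)
                    frontier.append(path + [r])
        return None

    # Query stated only in terms of attributes: employee names and their cities.
    wanted = relations_for(["emp_name", "city"])
    print(wanted)                      # {'emp_name': 'EMP', 'city': 'LOC'}
    print(join_path("EMP", "LOC"))     # ['EMP', 'DEPT', 'LOC']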

Relevance:

20.00%

Publisher:

Abstract:

An applicative language based on the LAMBDA-Calculus is presented. The language, SLIPS (Small Language for Instruction Purposes), is described using the LAMBDA-Calculus as a metalanguage. A call-by-need mechanism of function invocation eliminates the drawbacks of both call-by-name and call-by-value. The system has been implemented in PASCAL.
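
Call-by-need can be illustrated compactly outside SLIPS. The Python sketch below (not SLIPS code) wraps arguments in thunks so that an argument is evaluated only if it is used, and at most once, which is how it avoids the drawbacks of call-by-value (wasted evaluation of unused arguments) and call-by-name (repeated evaluation of used ones).

    # Minimal sketch of call-by-need: an argument is wrapped in a thunk,
    # evaluated only if used, and evaluated at most once.

    class Thunk:
        def __init__(self, compute):
            self._compute = compute
            self._done = False
            self._value = None

        def force(self):
            if not self._done:              # evaluate on first use only
                self._value = self._compute()
                self._done = True
            return self._value

    def expensive():
        print("evaluating expensive argument")
        return 42

    def pick(flag, a, b):
        """Uses exactly one of its (lazily passed) arguments."""
        return a.force() if flag else b.force()

    x = Thunk(expensive)
    y = Thunk(lambda: 1 / 0)                # never forced, so never fails
    print(pick(True, x, y))                 # side effect printed once, then 42
    print(x.force())                        # cached: no second evaluation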

Relevance:

20.00%

Publisher:

Abstract:

Many novel computer architectures, such as array processors and multiprocessors, that achieve high performance through the use of concurrency exploit variations of the von Neumann model of computation. The effective utilization of such machines places special demands on programmers and their programming languages, such as the structuring of data into vectors or the partitioning of programs into concurrent processes. In comparison, the data flow model of computation demands only that the principle of structured programming be followed. A data flow program, often represented as a data flow graph, is a program that expresses a computation by indicating the data dependencies among operators. A data flow computer is a machine designed to take advantage of the concurrency in data flow graphs by executing data-independent operations in parallel. In this paper, we discuss the design of a high-level language (DFL: Data Flow Language) suitable for data flow computers, and some sample procedures in DFL are presented. The implementation aspects are not discussed in detail since no new problems are encountered. The language DFL embodies the concepts of functional programming, but in appearance closely resembles Pascal. The language is a better vehicle than the data flow graph for expressing a parallel algorithm. The compiler has been implemented on a DEC 1090 system in Pascal.
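
The firing rule of a data flow graph, that an operator executes as soon as all its operands are available, can be sketched in a few lines. The toy Python interpreter below illustrates only that rule, not DFL itself: it evaluates r = (a + b) * (a - b), and the addition and subtraction become enabled together, which is exactly the parallelism a data flow machine would exploit.

    # Minimal sketch of data-flow execution (not DFL): operators fire as soon
    # as all their operands are available.

    # Toy graph for r = (a + b) * (a - b): node -> (operation, operand nodes).
    GRAPH = {
        "sum":  (lambda x, y: x + y, ("a", "b")),
        "diff": (lambda x, y: x - y, ("a", "b")),
        "r":    (lambda x, y: x * y, ("sum", "diff")),
    }

    def run(graph, inputs):
        values = dict(inputs)
        pending = dict(graph)
        while pending:
            # Every node whose operands are all available is enabled; "sum" and
            # "diff" become enabled together, exposing the parallelism.
            enabled = [n for n, (_, deps) in pending.items()
                       if all(d in values for d in deps)]
            if not enabled:
                raise ValueError("cycle or missing input")
            for n in enabled:
                op, deps = pending.pop(n)
                values[n] = op(*(values[d] for d in deps))
        return values

    print(run(GRAPH, {"a": 7, "b": 3}))
    # {'a': 7, 'b': 3, 'sum': 10, 'diff': 4, 'r': 40}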

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a low-cost but high-resolution retinal image acquisition system for the human eye. The images acquired by a CMOS image sensor are communicated through the Universal Serial Bus (USB) interface to a personal computer for viewing and further processing. The image acquisition time was estimated to be 2.5 seconds. The system can also be used in telemedicine applications.

Relevance:

20.00%

Publisher:

Abstract:

This correspondence describes a method for the automated segmentation of speech. The proposed method uses a specially designed filter bank, called the Bach filter bank, which makes use of music-related perception criteria. The speech signal is treated as a continuously time-varying signal, as opposed to a short-time stationary model. A comparative study has been made of the performance using Mel, Bark and Bach scale filter banks. Preliminary results show up to 80% matches within 20 ms of the manually segmented data, without any information about the content of the text and without any language dependence. The Bach filters are seen to marginally outperform the other filters.
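
The abstract does not define the Bach scale, so the comparison below is hedged: it simply assumes that a music-inspired scale spaces filter center frequencies geometrically, like the equal-tempered semitone ratio, and contrasts that with the standard Mel warping. The scale assumption and frequency range are for illustration only, not the paper's filter design.

    # Hedged illustration: assume a music-like scale places filter centers
    # geometrically (constant frequency ratio) and compare with Mel spacing.
    import numpy as np

    def mel_centers(fmin, fmax, n):
        """Center frequencies equally spaced on the Mel scale."""
        mel = lambda f: 2595.0 * np.log10(1.0 + f / 700.0)
        inv = lambda m: 700.0 * (10.0 ** (m / 2595.0) - 1.0)
        return inv(np.linspace(mel(fmin), mel(fmax), n))

    def equal_tempered_centers(fmin, fmax, n):
        """Center frequencies spaced by a constant ratio (semitone-like)."""
        return np.geomspace(fmin, fmax, n)

    print(np.round(mel_centers(100, 4000, 8), 1))
    print(np.round(equal_tempered_centers(100, 4000, 8), 1))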

Relevance:

20.00%

Publisher:

Abstract:

The sinusoidal structured light projection (SSLP) technique, specifically the phase-stepping method, is in widespread use for obtaining accurate, dense 3-D data. However, if the object under investigation possesses surface discontinuities, the phase unwrapping stage (an intermediate step in SSLP) necessarily requires several additional images of the object with projected fringes of different spatial frequencies as input to generate a reliable 3-D shape. On the other hand, the color-coded structured light projection (CSLP) technique requires a single image as input but generates only sparse 3-D data. We therefore propose the use of CSLP in conjunction with SSLP to obtain dense 3-D data with a minimum number of input images. This approach is shown to be significantly faster and more reliable than the temporal phase unwrapping procedure that uses a complete exponential sequence. For example, for a measurement with the accuracy obtained by interrogating the object with 32 fringes in the projected pattern, the proposed strategy requires only 5 frames, compared with the 24 frames required by the latter method.
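
The 5-versus-24 frame count can be reconstructed under one assumption the abstract leaves implicit, namely four phase-stepped images per fringe frequency; the snippet below just does that arithmetic and is not taken from the paper.

    # Rough frame-count check for the 32-fringe example, assuming (not stated
    # in the abstract) four phase steps per fringe frequency.
    phase_steps = 4
    exponential_sequence = [1, 2, 4, 8, 16, 32]   # complete exponential sequence

    temporal_unwrapping_frames = phase_steps * len(exponential_sequence)  # 24
    proposed_frames = phase_steps * 1 + 1   # 32-fringe pattern + one CSLP image = 5

    print(temporal_unwrapping_frames, proposed_frames)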

Relevance:

20.00%

Publisher:

Abstract:

A new language concept for high-level distributed programming is proposed. Programs are organised as a collection of concurrently executing processes. Some of these processes, referred to as liaison processes, have a monitor-like structure and contain ports which may be invoked by other processes for the purposes of synchronisation and communication. Synchronisation is achieved by conditional activation of ports and also through port control constructs which may directly specify the execution ordering of ports. These constructs implement a path-expression-like mechanism for synchronisation and are also equipped with options to provide conditional, non-deterministic and priority ordering of ports. The usefulness and expressive power of the proposed concepts are illustrated through solutions of several representative programming problems. Some implementation issues are also considered.
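
A rough Python analogue of a liaison process is sketched below; it is not the proposed language, only an illustration of two of its ideas: conditional activation of a port and an execution-ordering constraint between ports (here, "withdraw" is accepted only after at least one "deposit" and only when funds suffice). The class and method names are invented for the example.

    # Rough analogue of a liaison process: a monitor-like object whose "ports"
    # are guarded by conditions and a simple ordering rule.
    import threading

    class Liaison:
        def __init__(self):
            self._cond = threading.Condition()
            self._balance = 0
            self._deposits = 0     # used to order ports: withdraw after deposit

        def deposit(self, amount):     # port 1: unconditionally enabled
            with self._cond:
                self._balance += amount
                self._deposits += 1
                self._cond.notify_all()

        def withdraw(self, amount):    # port 2: enabled only after a deposit, with sufficient funds
            with self._cond:
                self._cond.wait_for(lambda: self._deposits > 0 and self._balance >= amount)
                self._balance -= amount
                return self._balance

    acct = Liaison()
    t = threading.Thread(target=lambda: print("left:", acct.withdraw(30)))
    t.start()                          # blocks until the liaison enables the port
    acct.deposit(100)
    t.join()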

Relevance:

20.00%

Publisher:

Abstract:

Indian logic has a long history. It somewhat covers the domains of two of the six schools (darsanas) of Indian philosophy, namely Nyaya and Vaisesika. The generally accepted definition of Indian logic over the ages is the science which ascertains valid knowledge either by means of six senses or by means of the five members of the syllogism; in other words, perception and inference constitute the subject matter of logic. The science of logic evolved in India through three ages, the ancient, the medieval and the modern, spanning almost thirty centuries. Advances in Computer Science, in particular in Artificial Intelligence, have got researchers in these areas interested in the basic problems of language, logic and cognition over the past three decades. In the 1980s, Artificial Intelligence evolved into knowledge-based and intelligent system design, and the knowledge base and inference engine became standard subsystems of an intelligent system. One of the important issues in the design of such systems is knowledge acquisition from humans who are experts in a branch of learning (such as medicine or law) and the transfer of that knowledge to a computing system. The second important issue in such systems is the validation of the knowledge base, i.e. ensuring that the knowledge is complete and consistent. It is in this context that a comparative study of Indian logic with recent theories of logic, language and knowledge engineering will help the computer scientist understand the deeper implications of the terms and concepts he is currently using and attempting to develop.

Relevance:

20.00%

Publisher:

Abstract:

It is possible to sample signals at a sub-Nyquist rate and still reconstruct them with reasonable accuracy, provided they exhibit local Fourier sparsity. The underdetermined systems of equations that arise from undersampling can be solved to yield sparse solutions using compressed sensing algorithms. In this paper, we propose a framework for real-time sampling of multiple analog channels with a single A/D converter, achieving a higher effective sampling rate. Signal reconstruction from noisy measurements is presented for two different synthetic signals. A scheme for implementing the algorithm in hardware is also suggested.
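
The abstract does not name the reconstruction algorithm, so the sketch below uses orthogonal matching pursuit, one standard compressed-sensing solver, as a stand-in: it recovers a Fourier-sparse synthetic signal from a random subset of its time samples. The signal parameters and the choice of solver are assumptions for illustration only.

    # Hedged sketch: recover a Fourier-sparse signal from sub-Nyquist random
    # time samples using orthogonal matching pursuit (OMP).
    import numpy as np

    rng = np.random.default_rng(0)
    n, m, k = 256, 64, 3                   # signal length, measurements, sparsity

    # Sparse spectrum: k active DFT coefficients.
    x_freq = np.zeros(n, dtype=complex)
    x_freq[rng.choice(n, k, replace=False)] = rng.standard_normal(k) + 1j * rng.standard_normal(k)
    signal = np.fft.ifft(x_freq) * n

    # Undersampling: keep m random time samples.
    rows = np.sort(rng.choice(n, m, replace=False))
    A = np.fft.ifft(np.eye(n), axis=0)[rows] * n   # maps spectrum -> kept samples
    y = signal[rows]

    def omp(A, y, k):
        """Greedily pick atoms, re-fitting the coefficients by least squares."""
        residual, support = y.copy(), []
        for _ in range(k):
            corr = np.abs(A.conj().T @ residual)
            corr[support] = 0
            support.append(int(np.argmax(corr)))
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef
        x_hat = np.zeros(A.shape[1], dtype=complex)
        x_hat[support] = coef
        return x_hat

    x_hat = omp(A, y, k)
    print("recovered support:", np.sort(np.flatnonzero(np.abs(x_hat) > 1e-6)))
    print("true support:     ", np.sort(np.flatnonzero(np.abs(x_freq) > 0)))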

Relevance:

20.00%

Publisher:

Abstract:

Tower platforms, with instrumentation at six levels above the surface up to a height of 30 m, were used to record various atmospheric parameters in the surface layer. Sensors for measuring both mean and fluctuating quantities were used, the majority of them indigenously built. Soil temperature sensors, down to a depth of 30 cm below the surface, were among the instruments connected to the mean data logger. A PC-based data acquisition system built at the Centre for Atmospheric Sciences, IISc, was used to acquire the data from the fast-response sensors. This paper reports the various components of a typical MONTBLEX tower observatory and describes the actual experiments carried out in the surface layer at four sites over the monsoon trough region as part of the MONTBLEX programme. It also describes and discusses several checks made on randomly selected tower data sets acquired during the experiment. The checks include visual inspection of time traces from various sensors, comparative plots of sensors measuring the same variable, wind and temperature profile plots, calculation of roughness lengths, statistical and stability parameters, diurnal variation of stability parameters, and plots of probability density and energy spectra for the different sensors. Results from these checks are found to be very encouraging and reveal the potential for further detailed analysis to understand more about surface-layer characteristics.

Relevance:

20.00%

Publisher:

Abstract:

Analytical studies are carried out to minimize the acquisition time in phase-locked loop (PLL) applications using aiding functions. A second-order aided PLL is realized with the help of the quasi-stationary approach to verify the acquisition behavior in the absence of noise. The acquisition time is measured both from the study of the LPF output transient and by employing a lock-detecting and indicating circuit, to cross-check the experimental and analytical results. A closed-form solution is obtained for evaluating the acquisition time with different aiding functions. The aiding signal is simple and economical and can be used with state-of-the-art hardware.
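
The closed-form expressions derived in the paper are not reproduced in the abstract. For context only, a commonly quoted textbook approximation (not the paper's result) for the pull-in time of an unaided, high-gain second-order PLL is

    T_p \approx \frac{(\Delta\omega)^2}{2\,\zeta\,\omega_n^3},

where \Delta\omega is the initial frequency offset, \zeta the damping factor and \omega_n the loop natural frequency. Because the unaided pull-in time grows quadratically with the frequency offset, a frequency-sweep or similar aiding function can shorten acquisition dramatically, which is the motivation for the aided loop studied here.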