27 results for Machine translation system
in Aston University Research Archive
Abstract:
For more than forty years, research has been ongoing into the use of computers for processing natural language. During this period methods have evolved, with various parsing techniques and grammars coming to prominence. Problems still exist, not least in the field of Machine Translation. However, one of the successes in this field is the translation of sublanguage. The present work reports on Deterministic Parsing, a relatively new parsing technique, and its application to the sublanguage of an aircraft maintenance manual for Machine Translation. The aim has been to investigate the practicability of using Deterministic Parsers in the analysis stage of a Machine Translation system. Machine Translation, sublanguage and parsing are described in general terms, with a review of the Deterministic Parsing systems pertinent to this research presented in detail. The interaction between Machine Translation, sublanguage and parsing, including Deterministic Parsing, is also highlighted. Two types of Deterministic Parser have been investigated: a Marcus-type parser, based on the design of the original Deterministic Parser (Marcus, 1980), and an LR-type Deterministic Parser for natural language, based on the LR parsing algorithm. In total, four Deterministic Parsers have been built and are described in the thesis. Two of these parsers are prototypes from which the remaining two, used on the sublanguage, have been developed. The thesis reports the results of parsing by the prototypes: a Marcus-type parser and an LR-type parser with a grammatical and linguistic range similar to that of the original Marcus parser. The Marcus-type parser uses a grammar of production rules, whereas the LR-type parser employs a Definite Clause Grammar (DCG).
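To make the determinism concrete, here is a minimal Python sketch of a Marcus-style parser: a small lookahead buffer, a stack of partial constituents, and rules that fire one at a time with no backtracking. The lexicon, rules and example sentence are invented for illustration and are not the grammar used in the thesis.

    # Marcus-style deterministic parsing in miniature (hypothetical
    # lexicon and rules). The parser inspects up to three buffer cells
    # and commits to exactly one action per step: no backtracking.
    LEXICON = {"the": "Det", "mechanic": "N", "inspects": "V",
               "engine": "N"}

    def parse(words):
        buffer = [(LEXICON[w], w) for w in words]  # lookahead buffer
        stack = []                                 # partial constituents
        while buffer or len(stack) > 1:
            cats = [cat for cat, _ in buffer[:3]]  # visible buffer cells
            if cats[:2] == ["Det", "N"]:           # attach: NP -> Det N
                det, n = buffer.pop(0), buffer.pop(0)
                stack.append(("NP", [det, n]))
            elif cats[:1] == ["V"]:                # shift the verb
                stack.append(buffer.pop(0))
            elif (len(stack) >= 2 and stack[-2][0] == "V"
                  and stack[-1][0] == "NP"):       # reduce: VP -> V NP
                np_obj = stack.pop()
                v = stack.pop()
                stack.append(("VP", [v, np_obj]))
            elif (len(stack) == 2 and stack[0][0] == "NP"
                  and stack[1][0] == "VP"):        # reduce: S -> NP VP
                vp = stack.pop()
                np_obj = stack.pop()
                stack.append(("S", [np_obj, vp]))
            else:
                raise ValueError("no rule fires: input outside the grammar")
        return stack[0]

    print(parse("the mechanic inspects the engine".split()))

Because exactly one rule can fire at each step, the parse is linear-time and never undone, which is the property that makes deterministic parsing attractive for a constrained sublanguage.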
Abstract:
Information technology has increased both the speed and medium of communication between nations. It has brought the world closer, but it has also created new challenges for translation: how we think about it, how we carry it out and how we teach it. Translation and Information Technology has brought together experts in computational linguistics, machine translation, translation education, and translation studies to discuss how these new technologies work; the effect of electronic tools such as the internet, bilingual corpora, and computer software on translator education and the practice of translation; and the conceptual gaps raised by the interface of human and machine.
Abstract:
Yorick Wilks is a central figure in the fields of Natural Language Processing and Artificial Intelligence. His influence extends to many areas and includes contributions to Machine Translation, word sense disambiguation, dialogue modeling and Information Extraction. This book celebrates the work of Yorick Wilks in the form of a selection of his papers, intended to reflect the range and depth of his work. The volume accompanies a Festschrift which celebrates his contribution to the fields of Computational Linguistics and Artificial Intelligence. The papers include early work carried out at Cambridge University, descriptions of groundbreaking work on Machine Translation and Preference Semantics, as well as more recent work on belief modeling and computational semantics. The selected papers reflect Yorick's contribution to both practical and theoretical aspects of automatic language processing.
Abstract:
Yorick Wilks is a central figure in the fields of Natural Language Processing and Artificial Intelligence. His influence extends to many areas of these fields and includes contributions to Machine Translation, word sense disambiguation, dialogue modeling and Information Extraction. This book celebrates the work of Yorick Wilks from the perspective of his peers. It consists of original chapters, each of which analyses an aspect of his work and links it to current thinking in that area. His work has spanned more than four decades but is shown to be pertinent to recent developments in language processing, such as the Semantic Web. This volume forms a two-part set together with Words and Intelligence I, Selected Works by Yorick Wilks, by the same editors.
Abstract:
It has never been easy for manufacturing companies to understand their confidence level in terms of how accurately, and with what degree of flexibility, parts can be made. This brings uncertainty in finding the most suitable manufacturing method as well as in controlling their product and process verification systems. The aim of this research is to develop a system for capturing the company's knowledge and expertise and then reflecting it in an MRP (Manufacturing Resource Planning) system. A key activity here is measuring manufacturing and machining capabilities to a reasonable confidence level. For this purpose an in-line control measurement system is introduced to the company. Using SPC (Statistical Process Control) not only helps to predict trends in the manufacturing of parts but also minimises human error in measurement. A Gauge R&R (Repeatability and Reproducibility) study identifies problems in the measurement systems. Measurement is like any other process in terms of variability; reducing this variation via an automated machine probing system helps to avoid defects in future products.

Developments in the aerospace, nuclear, and oil and gas industries demand materials with high performance and high temperature resistance under corrosive and oxidising environments. Superalloys were developed in the latter half of the 20th century as high-strength materials for such purposes. For the same characteristics, superalloys are considered difficult-to-cut alloys when it comes to forming and machining. Furthermore, owing to the sensitivity of superalloy applications, in many cases they must be manufactured to tight tolerances. In addition, superalloys, specifically nickel-based ones, have unique features such as low thermal conductivity due to the high nickel content in their composition. This causes a high surface temperature on the workpiece at the machining stage, which leads to deformation in the final product.

Like every process, material variations have a significant impact on machining quality. The main causes of variation are chemical composition and mechanical hardness. The non-uniform distribution of metal elements is a major source of variation in metallurgical structures. Different heat treatment standards are designed for processing the material to the desired hardness levels based on the application. In order to take corrective actions, a study of the material aspects of superalloys has been conducted. In this study, samples from different batches of material have been analysed. This involved material preparation for microscopy analysis and examination of the effect of chemical composition on hardness (before and after heat treatment). Some of the results are discussed and presented in this paper.
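As a minimal sketch of the kind of SPC calculation involved, the following Python fragment computes X-bar and R control limits for subgrouped measurements. The measurement data are invented, and A2, D3, D4 are the standard control-chart constants for subgroups of size five.

    import statistics

    # X-bar/R control limits for subgroups of size n = 5
    # (illustrative data, not from the study).
    A2, D3, D4 = 0.577, 0.0, 2.114

    subgroups = [                 # five consecutive measurements, mm
        [25.01, 25.03, 24.99, 25.02, 25.00],
        [25.02, 25.00, 25.01, 24.98, 25.03],
        [24.97, 25.04, 25.01, 25.00, 25.02],
    ]

    xbars = [statistics.mean(g) for g in subgroups]   # subgroup means
    ranges = [max(g) - min(g) for g in subgroups]     # subgroup ranges
    xbarbar = statistics.mean(xbars)                  # grand mean
    rbar = statistics.mean(ranges)                    # mean range

    # A subgroup mean or range outside these limits signals a special
    # cause rather than normal process variation.
    ucl_x, lcl_x = xbarbar + A2 * rbar, xbarbar - A2 * rbar
    ucl_r, lcl_r = D4 * rbar, D3 * rbar

    print(f"X-bar chart: CL={xbarbar:.4f}, UCL={ucl_x:.4f}, LCL={lcl_x:.4f}")
    print(f"R chart:     CL={rbar:.4f}, UCL={ucl_r:.4f}, LCL={lcl_r:.4f}")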
Abstract:
Five-axis machine tools are becoming increasingly popular as customers demand more complex machined parts. In high value manufacturing, machine tools are essential to producing high accuracy products. High accuracy manufacturing requires producing parts repeatably and precisely, in compliance with the defined design specifications. The performance of machine tools is often affected by geometric errors arising from a variety of causes, including incorrect tool offsets, errors in the centres of rotation and thermal growth. As a consequence, it can be difficult to produce highly accurate parts consistently. It is therefore essential to ensure that machine tools are verified in terms of their geometric and positioning accuracy. When machine tools are verified in this way, the resulting numerical values of positional accuracy and process capability can be used to define design-for-verification rules and algorithms, so that machined parts can be produced without scrap and with little or no after-process measurement. In this paper the benefits of machine tool verification are listed, and a case study is used to demonstrate the implementation of robust machine tool performance measurement and diagnostics using a ballbar system.
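A ballbar test traces a nominal circle and records the deviation of the measured radius; the peak-to-peak deviation gives a simple circularity figure. The Python sketch below illustrates only that arithmetic, with invented readings; commercial ballbar software additionally diagnoses individual error sources (backlash, squareness, scale mismatch) from the shape of the trace.

    import math

    NOMINAL_RADIUS = 100.0  # mm, ballbar length (illustrative)

    # (angle in degrees, measured radius in mm): hypothetical readings
    readings = [(a, NOMINAL_RADIUS + 0.002 * math.sin(math.radians(2 * a)))
                for a in range(0, 360, 10)]

    deviations = [r - NOMINAL_RADIUS for _, r in readings]
    circularity = max(deviations) - min(deviations)  # peak-to-peak

    print(f"circularity (peak-to-peak): {circularity * 1000:.1f} um")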
Abstract:
Today, the data available to tackle many scientific challenges is vast in quantity and diverse in nature. The exploration of heterogeneous information spaces requires suitable mining algorithms as well as effective visual interfaces. miniDVMS v1.8 provides a flexible visual data mining framework which combines advanced projection algorithms developed in the machine learning domain with visual techniques developed in the information visualisation domain. The advantage of this interface is that the user is directly involved in the data mining process. Principled projection methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), are integrated with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates, and user interaction facilities, to provide this integrated visual data mining framework. The software also supports conventional visualisation techniques such as principal component analysis (PCA), Neuroscale, and PhiVis. This user manual gives an overview of the purpose of the software tool, highlights some of the issues to consider when creating a new model, and provides information on how to install and use the tool. The manual does not require readers to be familiar with the algorithms the tool implements; basic computing skills are enough to operate the software.
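As a minimal sketch of the simplest projection the tool supports, the following numpy fragment reduces a data set to a two-dimensional map with PCA. This is illustrative code, not miniDVMS's own implementation, and the data are randomly generated.

    import numpy as np

    # PCA projection to 2-D for visualisation (illustrative data).
    rng = np.random.default_rng(0)
    data = rng.normal(size=(200, 10))     # 200 points, 10 features

    centred = data - data.mean(axis=0)    # centre each feature
    # Principal axes via SVD of the centred data matrix.
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    projected = centred @ vt[:2].T        # 2-D map coordinates

    print(projected.shape)                # (200, 2), ready to plot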
Abstract:
This paper will seek to explicate the changes in the New Zealand health sector, informed by the concepts of problematization, inscription and the construction of networks (Callon, 1986; Latour, 1987, 1993). This will involve applying a framework of interpretation based on the concepts of Latour's sociology of translation. Material on problematization and inscription will be incorporated to provide an explanatory frame of reference that enables us to make sense of the processes of change in the New Zealand health sector. The sociology of translation will be used to explain the processes which underlie the changes and to capture their effects, such as changes in policy and structure, producing new networks within which 'allies' could be enrolled in support of the health reforms.
Abstract:
Linear typing schemes can be used to guarantee non-interference, and hence the soundness of in-place update with respect to a functional semantics. But linear schemes are restrictive in practice, and more restrictive than necessary to guarantee soundness of in-place update. This limitation has prompted research into static analysis and more sophisticated typing disciplines to determine when in-place update may be safely used, or to combine linear and non-linear schemes. Here we contribute to this direction by defining a new typing scheme that better approximates the semantic property of soundness of in-place update for a functional semantics. We begin from the observation that some data are used only in a read-only context, after which they may be safely re-used before being destroyed. Formalising the in-place update interpretation in a machine model semantics allows us to refine this observation, motivating three usage aspects, apparent from the semantics, that are used to annotate function argument types. The aspects are (1) used destructively, (2) used read-only but shared with the result, and (3) used read-only and not shared with the result. The main novelty is aspect (2), which allows a linear value to be safely read, and even aliased with a result of a function, without being consumed. This makes our type system more expressive than previous systems for functional languages in the literature. The system remains simple and intuitive, but it enjoys a strong soundness property whose proof is non-trivial. Moreover, our analysis features principal types and feasible type reconstruction, as shown in M. Konečný (In TYPES 2002 Workshop, Nijmegen, Proceedings, Springer-Verlag, 2003).
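The hazard that the aspects classify can be seen in any language with mutable structures. Below is a minimal Python sketch (not the paper's formal language): in-place update is sound only when no live alias still observes the old value, which is exactly what distinguishes aspect (1) from aspects (2) and (3).

    def set_head_functional(xs, v):
        return [v] + xs[1:]      # fresh list; existing aliases unaffected

    def set_head_in_place(xs, v):
        xs[0] = v                # destructive: reuses xs's storage
        return xs

    shared = [1, 2, 3]
    alias = shared               # read-only use, shared with a result

    set_head_functional(shared, 99)
    print(alias)                 # [1, 2, 3] -- functional semantics kept

    set_head_in_place(shared, 99)
    print(alias)                 # [99, 2, 3] -- the alias observed the
                                 # update, so in-place reuse was unsound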
Abstract:
A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system, from a menu of functional blocks, using a graphics software system with a graphics terminal. Additions may be made to the menu of functional blocks, to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analysed by a graphics compiler which generates the programs and data structure to realise the run-time software. The run-time software has been designed as a data-driven system which allows for modifications at the run-time level in both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible by using CORAL 66 as the high level language throughout the entire system, the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
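A minimal Python sketch (not CORAL 66) of the data-driven idea: the compiled diagram is simply a data structure of block instances and connections that an interpreter walks at run time, so parameters and configuration can be changed without regenerating code. The block types and wiring here are invented for illustration.

    BLOCK_TYPES = {
        "gain":   lambda p, x: p["k"] * x,
        "offset": lambda p, x: x + p["b"],
        "limit":  lambda p, x: max(p["lo"], min(p["hi"], x)),
    }

    # One entry per block instance: type, parameters, input source.
    diagram = [
        {"name": "g1",  "type": "gain",   "params": {"k": 2.0},              "input": "in"},
        {"name": "o1",  "type": "offset", "params": {"b": 1.5},              "input": "g1"},
        {"name": "lim", "type": "limit",  "params": {"lo": 0.0, "hi": 10.0}, "input": "o1"},
    ]

    def run(diagram, value):
        signals = {"in": value}
        for block in diagram:                 # blocks in execution order
            fn = BLOCK_TYPES[block["type"]]
            signals[block["name"]] = fn(block["params"], signals[block["input"]])
        return signals

    print(run(diagram, 3.0))          # {'in': 3.0, 'g1': 6.0, 'o1': 7.5, 'lim': 7.5}
    diagram[0]["params"]["k"] = 4.0   # run-time change, no recompilation
    print(run(diagram, 3.0))          # 'lim' now saturates at 10.0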
Abstract:
Traditional high-speed machinery actuators are powered and coordinated by mechanical linkages driven from a central drive, but these linkages may be replaced by independently synchronised electric drives. Problems associated with utilising such electric drives for this form of machinery were investigated. The research concentrated on a high-speed rod-making machine, which required control of high inertias (0.01-0.5 kgm^2) at continuous high speed (2500 r/min), with low relative phase errors between two drives (0.0025 radians). Traditional minimum-energy drive selection techniques for incremental motions were not applicable to continuous applications, which require negligible energy dissipation, so new selection techniques were developed. A brushless configuration constant enabled the comparison of seven different servo systems; the rare earth brushless drives had the best power rates, a key performance measure. Simulation was used to review control strategies, from which a microprocessor controller with a proportional velocity loop within a proportional position loop with velocity feedforward was designed. Local control schemes were investigated as a means of reducing relative errors between drives: the slave of a master/slave scheme compensates for the master's errors; the matched scheme uses drives with similar absolute errors so that the relative error is minimised; and the feedforward scheme minimises error by adding compensation from previous knowledge. Simulation gave the approximate velocity loop bandwidth and position loop gain required to meet the specification. Theoretical limits for these parameters were defined in terms of digital sampling delays, quantisation, and system phase shifts. Performance degradation due to mechanical backlash was evaluated. Thus any drive could be checked to ensure that the performance specification could be realised. A two-drive demonstrator was commissioned with 0.01 kgm^2 loads. Using simulation, the performance of one drive was improved by increasing the velocity loop bandwidth fourfold. With the master/slave scheme, relative errors were within 0.0024 radians at a constant 2500 r/min for two 0.01 kgm^2 loads.
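The cascade described (a proportional velocity loop inside a proportional position loop with velocity feedforward) can be sketched in Python as follows; the gains, rigid-inertia plant and time step are invented for illustration and are far simpler than the drive models used in the research.

    J = 0.01        # load inertia, kgm^2 (illustrative)
    KP_POS = 50.0   # position loop gain, 1/s
    KP_VEL = 2.0    # velocity loop gain, Nms/rad
    DT = 0.0002     # simulation step, s
    N = 50000       # 10 s of simulated motion

    theta = omega = 0.0          # position (rad) and velocity (rad/s)
    omega_ref = 261.8            # ~2500 r/min in rad/s

    for step in range(N):
        theta_ref = omega_ref * step * DT            # ramp demand
        # Outer loop: position error -> velocity demand, plus
        # feedforward of the known reference velocity.
        omega_cmd = KP_POS * (theta_ref - theta) + omega_ref
        torque = KP_VEL * (omega_cmd - omega)        # inner loop
        omega += (torque / J) * DT                   # rigid-body plant
        theta += omega * DT

    print(f"phase error after 10 s: {omega_ref * N * DT - theta:.2e} rad")

Without the feedforward term, the proportional cascade would track the velocity ramp with a constant lag of omega_ref / KP_POS; the feedforward removes that lag, which is why the scheme can hold small relative phase errors between drives.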