803 results for Codes and repertoires of language
Abstract:
The main contribution of this work is to analyze and describe the state-of-the-art performance of answer scoring systems from the SemEval-2013 task, and to continue the development of an answer scoring system (EHU-ALM) built at the University of the Basque Country. Overall, this master's thesis focuses on finding configurations that improve results on the SemEval dataset, using attribute engineering techniques to find optimal feature subsets and trying different hierarchical configurations to analyze their performance against the traditional one-versus-all approach. Throughout the work we propose two alternative strategies: on the one hand, improving the EHU-ALM system without changing its architecture, and, on the other hand, improving the system by adapting it to a hierarchical configuration. To build these new models we describe and use various attribute engineering, data preprocessing, and machine learning techniques.
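For illustration only, the minimal Python sketch below contrasts a flat one-versus-all configuration with a simple two-level hierarchical one, both preceded by attribute (feature) selection; the data, feature counts, and classifiers are hypothetical placeholders and do not reproduce the EHU-ALM system.

    # Hypothetical sketch: flat one-versus-all vs. a two-level hierarchy,
    # both with attribute (feature) selection. Not the EHU-ALM code.
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 40))            # placeholder feature vectors
    y = rng.integers(0, 3, size=200)          # placeholder 3-way answer labels

    # Flat one-versus-all configuration.
    flat = make_pipeline(SelectKBest(f_classif, k=10),
                         OneVsRestClassifier(LogisticRegression(max_iter=1000)))
    flat.fit(X, y)

    # Minimal hierarchy: first split class 0 from the rest, then
    # discriminate among the remaining classes.
    top = make_pipeline(SelectKBest(f_classif, k=10),
                        LogisticRegression(max_iter=1000)).fit(X, y == 0)
    rest = y != 0
    leaf = make_pipeline(SelectKBest(f_classif, k=10),
                         LogisticRegression(max_iter=1000)).fit(X[rest], y[rest])

    def predict_hierarchical(x):
        """Route a single example through the two-level hierarchy."""
        if top.predict(x.reshape(1, -1))[0]:
            return 0
        return leaf.predict(x.reshape(1, -1))[0]

    print(flat.predict(X[:3]), [predict_hierarchical(x) for x in X[:3]])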
Abstract:
This paper describes work performed as part of the U.K. Alvey-sponsored Voice Operated Database Inquiry System (VODIS) project in the area of intelligent dialogue control. The principal aims of the work were to develop a habitable interface for the untrained user; to investigate the degree to which dialogue control can be used to compensate for deficiencies in recognition performance; and to examine the requirements on dialogue control for generating natural speech output. A data-driven methodology is described, based on the use of frames in which dialogue topics are organized hierarchically. The concept of a dynamically adjustable scope is introduced to permit adaptation to recognizer performance, and the use of historical and hierarchical contexts is described to facilitate the construction of contextually relevant output messages. © 1989.
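As a purely hypothetical sketch of the idea (not the VODIS implementation), the Python fragment below organizes dialogue topics as a hierarchy of frames and narrows or widens the set of topics in scope according to recognizer confidence; the topic names and threshold are invented.

    # Hypothetical frames and scope control; names and thresholds are invented.
    class Frame:
        def __init__(self, topic, children=None):
            self.topic = topic
            self.children = children or []

    # Dialogue topics organized hierarchically.
    root = Frame("enquiry", [
        Frame("journey", [Frame("origin"), Frame("destination"), Frame("date")]),
        Frame("payment", [Frame("card_type"), Frame("card_number")]),
    ])

    def topics_in_scope(frame, depth):
        """Collect topics reachable within `depth` levels of the current frame."""
        topics = [frame.topic]
        if depth > 0:
            for child in frame.children:
                topics.extend(topics_in_scope(child, depth - 1))
        return topics

    def scope_depth(recognition_confidence):
        """Widen the scope when recognition is reliable, narrow it otherwise."""
        return 2 if recognition_confidence > 0.8 else 1

    print(topics_in_scope(root, scope_depth(0.9)))   # broad scope
    print(topics_in_scope(root, scope_depth(0.4)))   # narrow scope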
Abstract:
This paper describes the development of an automated design optimization system that uses a high-fidelity Reynolds-Averaged CFD analysis procedure to minimize fan forcing and fan BOGV (bypass outlet guide vane) losses simultaneously, taking into account the downstream pylon and RDF (radial drive fairing) distortions. The design space consists of the OGV's stagger angle, trailing-edge recambering, and axial and circumferential positions, leading to a variable-pitch optimum design. An advanced optimization system called SOFT (Smart Optimisation for Turbomachinery) was used to integrate a number of pre-processor, simulation, and in-house grid generation codes and post-processor programs. A number of multi-objective, multi-point optimizations were carried out by SOFT on a cluster of workstations and are reported herein.
Abstract:
More than ten bradykinin-related peptides and their cDNAs have been identified from amphibians, but their genes are unknown. In the present study, four cDNAs encoding one, two, four and six copies of bradykinin-related peptides, respectively, were cloned from the frog (Odorrana grahami) skin cDNA library. Three bradykinin-related peptides (bradykinin, Thr6-bradykinin, Leu5Thr6-bradykinin) were deduced from these four cDNA sequences. Based on the cDNA sequence, the gene sequence encoding an amphibian bradykinin-related peptide from O. grahami was determined. It is composed of 7481 base pairs, including two exons and two introns. The first exon codes for the signal peptide, and the second exon codes for the acidic spacer peptide and Thr6-bradykinin. The promoter region of the bradykinin gene contains several putative recognition sites for nuclear factors, such as SRY, GATA-1, LYF-1, DeltaE, CDXA, NKX-2.5, MIF1 and S8. The current work may facilitate understanding of the regulation and possible functions of amphibian skin bradykinin-related peptides. (C) 2009 Elsevier Masson SAS. All rights reserved.
Abstract:
Motivated by the design and development challenges of the BART case study, an approach for developing and analyzing a formal model for reactive systems is presented. The approach makes use of a domain-specific language for specifying control algorithms able to satisfy competing properties such as safety and optimality. The domain language, called SPC, offers several key abstractions, such as the state, the profile, and the constraint, to facilitate problem specification. Using a high-level program transformation system such as HATS, being developed at the University of Nebraska at Omaha, specifications in this modelling language can be transformed into ML code. The resulting executable specification can be further refined by applying generic transformations to the abstractions provided by the domain language. Problem-dependent transformations utilizing domain-specific knowledge and properties may also be applied. The result is a significantly more efficient implementation which can be used for simulation and for gaining deeper insight into design decisions and various control policies. The correctness of transformations can be established using Rewrite Rule Laboratory, a rewrite-rule-based induction theorem prover developed at the University of New Mexico.
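The toy Python fragment below illustrates the general flavour of rule-based program transformation (repeatedly rewriting an expression tree until no rule applies); it is an invented example and does not reproduce HATS, SPC, or Rewrite Rule Laboratory.

    # Toy rewrite engine: apply rules bottom-up until none fires.
    def rewrite(expr, rules):
        if isinstance(expr, tuple):
            expr = tuple(rewrite(e, rules) for e in expr)
        for rule in rules:
            new = rule(expr)
            if new is not None:
                return rewrite(new, rules)
        return expr

    # Example rule: constant-fold additions, e.g. ('+', 2, 3) -> 5.
    def fold_add(expr):
        if (isinstance(expr, tuple) and expr[0] == '+'
                and all(isinstance(a, int) for a in expr[1:])):
            return sum(expr[1:])
        return None

    print(rewrite(('+', ('+', 1, 2), 4), [fold_add]))   # prints 7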
Abstract:
The study was supported by the Knowledge Innovation Foundation of the Institute of Geographical Sciences and Natural Resources, Chinese Academy of Sciences (Grant No. 200906002) and the Key Directional Project of Knowledge Innovation of the Chinese Academy of Sciences (Grant No. KSCX2-YW-N-46-01). The authors would like to thank Luke Driskell for his kind help and hard work on polishing the English language of the article.
Abstract:
Does knowledge of language consist of symbolic rules? How do children learn and use their linguistic knowledge? To elucidate these questions, we present a computational model that acquires phonological knowledge from a corpus of common English nouns and verbs. In our model the phonological knowledge is encapsulated as boolean constraints operating on classical linguistic representations of speech sounds in terms of distinctive features. The learning algorithm compiles a corpus of words into increasingly sophisticated constraints. The algorithm is incremental, greedy, and fast. It yields one-shot learning of phonological constraints from a few examples. Our system exhibits behavior similar to that of young children learning phonological knowledge. As a bonus, the constraints can be interpreted as classical linguistic rules. The computational model can be implemented by a surprisingly simple hardware mechanism. Our mechanism also sheds light on a fundamental AI question: How are signals related to symbols?
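As a rough, hypothetical illustration of the constraint-learning idea (not the authors' model), the Python sketch below represents segments by boolean distinctive features and treats every adjacent-segment feature pattern that never occurs in a tiny word list as a learned constraint; the feature table and corpus are invented.

    # Invented feature table and corpus; the real model and features differ.
    from itertools import product

    FEATURES = {
        "p": {"voice": False, "nasal": False, "consonant": True},
        "b": {"voice": True,  "nasal": False, "consonant": True},
        "m": {"voice": True,  "nasal": True,  "consonant": True},
        "t": {"voice": False, "nasal": False, "consonant": True},
        "d": {"voice": True,  "nasal": False, "consonant": True},
        "n": {"voice": True,  "nasal": True,  "consonant": True},
        "a": {"voice": True,  "nasal": False, "consonant": False},
        "i": {"voice": True,  "nasal": False, "consonant": False},
    }

    def attested_patterns(words):
        """Feature-value pairs observed on adjacent segments in the corpus."""
        seen = set()
        for w in words:
            for x, y in zip(w, w[1:]):
                for f, g in product(FEATURES[x], FEATURES[y]):
                    seen.add((f, FEATURES[x][f], g, FEATURES[y][g]))
        return seen

    def learn_constraints(words):
        """Every never-attested adjacent pattern becomes a boolean constraint."""
        names = ("voice", "nasal", "consonant")
        space = {(f, vf, g, vg) for f in names for vf in (True, False)
                 for g in names for vg in (True, False)}
        return space - attested_patterns(words)

    constraints = learn_constraints(["pat", "bad", "man", "nip", "tan", "dim"])
    print(len(constraints), sorted(constraints)[:3])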
Abstract:
An approach towards shape description, based on prototype modification and generalized cylinders, has been developed and applied to the object domains of pottery and polyhedra: (1) A program describes and identifies pottery from vase outlines entered as lists of points. The descriptions have been modeled after descriptions by archeologists, with the result that identifications made by the program are remarkably consistent with those of the archeologists. It has been possible to quantify their shape descriptors, which are everyday terms in our language applied to many sorts of objects besides pottery, so that the resulting descriptions seem very natural. (2) New parsing strategies for polyhedra overcome some limitations of previous work. A special feature is that the processes of parsing and identification are carried out simultaneously.
Abstract:
Act2 is a highly concurrent programming language designed to exploit the processing power available from parallel computer architectures. The language supports advanced concepts in software engineering, providing high-level constructs suitable for implementing artificial intelligence applications. Act2 is based on the Actor model of computation, in which virtual computational agents communicate by message passing. Act2 serves as a framework in which to integrate an actor language, a description and reasoning system, and a problem-solving and resource management system. This document describes issues in Act2's design and the implementation of an interpreter for the language.
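For orientation only, here is a generic Python sketch of actor-style computation (independent agents with mailboxes communicating purely by message passing); it is not Act2 and glosses over the model's finer points.

    # Generic actors with mailboxes; invented example, unrelated to Act2 itself.
    import queue
    import threading

    class Actor(threading.Thread):
        def __init__(self, name, behaviour):
            super().__init__(daemon=True)
            self.name = name
            self.mailbox = queue.Queue()
            self.behaviour = behaviour        # callable(actor, message)

        def send(self, message):
            self.mailbox.put(message)

        def run(self):
            while True:
                msg = self.mailbox.get()
                if msg is None:               # poison pill: stop the actor
                    break
                self.behaviour(self, msg)

    def printer(actor, msg):
        print(f"{actor.name} received {msg}")

    def doubler(actor, msg):
        value, reply_to = msg
        reply_to.send(value * 2)              # reply only by message passing

    out = Actor("printer", printer)
    dbl = Actor("doubler", doubler)
    out.start(); dbl.start()

    dbl.send((21, out))                       # doubler replies to the printer
    dbl.send(None); dbl.join()                # ensure the reply was delivered
    out.send(None); out.join()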
Abstract:
"The Structure and Interpretation of Computer Programs" is the entry-level subject in Computer Science at the Massachusetts Institute of Technology. It is required of all students at MIT who major in Electrical Engineering or in Computer Science, as one fourth of the "common core curriculum," which also includes two subjects on circuits and linear systems and a subject on the design of digital systems. We have been involved in the development of this subject since 1978, and we have taught this material in its present form since the fall of 1980 to approximately 600 students each year. Most of these students have had little or no prior formal training in computation, although most have played with computers a bit and a few have had extensive programming or hardware design experience. Our design of this introductory Computer Science subject reflects two major concerns. First we want to establish the idea that a computer language is not just a way of getting a computer to perform operations, but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute. Secondly, we believe that the essential material to be addressed by a subject at this level, is not the syntax of particular programming language constructs, nor clever algorithms for computing particular functions of efficiently, not even the mathematical analysis of algorithms and the foundations of computing, but rather the techniques used to control the intellectual complexity of large software systems.
Abstract:
Marggraf Turley, R. (2002). The Politics of Language in Romantic Literature. Basingstoke: Palgrave Macmillan. RAE2008
Abstract:
J. A. Gallagher, A. J. Cairns and C. J. Pollock (2004). Cloning and characterization of a putative fructosyltransferase and two putative invertase genes from the temperate grass Lolium temulentum L. Journal of Experimental Botany, 55(397), pp. 557-569. Sponsorship: BBSRC RAE2008
Abstract:
As the commoditization of sensing, actuation and communication hardware increases, so does the potential for dynamically tasked sense-and-respond networked systems (i.e., Sensor Networks or SNs) to replace existing disjoint and inflexible special-purpose deployments (closed-circuit security video, anti-theft sensors, etc.). While various solutions have emerged to many individual SN-centric challenges (e.g., power management, communication protocols, role assignment), perhaps the largest remaining obstacle to widespread SN deployment is that those who wish to deploy, utilize, and maintain a programmable Sensor Network lack the programming and systems expertise to do so. The contributions of this thesis center on the design, development and deployment of the SN Workbench (snBench). snBench embodies an accessible, modular programming platform coupled with a flexible and extensible run-time system that, together, support the entire life-cycle of distributed sensory services. As it is impossible to find a one-size-fits-all programming interface, this work advocates the use of tiered layers of abstraction that enable a variety of high-level, domain-specific languages to be compiled to a common (thin-waist) tasking language; this common tasking language is statically verified and can be subsequently re-translated, if needed, for execution on a wide variety of hardware platforms. snBench provides: (1) a common sensory tasking language (Instruction Set Architecture) powerful enough to express complex SN services, yet simple enough to be executed by highly constrained resources with soft, real-time constraints; (2) a prototype high-level language (and corresponding compiler) to illustrate the utility of the common tasking language and the tiered programming approach in this domain; (3) an execution environment and a run-time support infrastructure that abstract a collection of heterogeneous resources into a single virtual Sensor Network, tasked via this common tasking language; and (4) novel formal methods (i.e., static analysis techniques) that verify safety properties and infer implicit resource constraints to facilitate resource allocation for new services. This thesis presents these components in detail, as well as two specific case studies: the use of snBench to integrate physical and wireless network security, and the use of snBench as the foundation for semester-long student projects in a graduate-level Software Engineering course.
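Purely as a sketch of the "thin-waist" idea (the actual snBench tasking language is not reproduced here, and the sensor names below are invented stand-ins), a tiny tasking expression can be represented as nested tuples and interpreted over abstract sensor handles.

    # Invented miniature tasking language and sensor stubs, for illustration.
    import random

    SENSORS = {"cam1": lambda: random.random(),
               "cam2": lambda: random.random()}

    def evaluate(task):
        """Evaluate a nested-tuple tasking expression."""
        op, *args = task
        if op == "read":                      # sample a named sensor
            return SENSORS[args[0]]()
        if op == "max":
            return max(evaluate(a) for a in args)
        if op == "gt":
            return evaluate(args[0]) > args[1]
        if op == "if":
            cond, then_t, else_t = args
            return evaluate(then_t) if evaluate(cond) else evaluate(else_t)
        raise ValueError(f"unknown op: {op}")

    # "If either camera reading exceeds a threshold, re-read cam1, else cam2."
    task = ("if", ("gt", ("max", ("read", "cam1"), ("read", "cam2")), 0.5),
            ("read", "cam1"), ("read", "cam2"))
    print(evaluate(task))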