746 results for Syntax


Relevance: 10.00%

Abstract:

We present the Unified Form Language (UFL), which is a domain-specific language for representing weak formulations of partial differential equations with a view to numerical approximation. Features of UFL include support for variational forms and functionals, automatic differentiation of forms and expressions, arbitrary function space hierarchies for multifield problems, general differential operators and flexible tensor algebra. With these features, UFL has been used to effortlessly express finite element methods for complex systems of partial differential equations in near-mathematical notation, resulting in compact, intuitive and readable programs. In this work we present the language and its construction. An implementation of UFL is freely available as an open-source software library. The library generates abstract syntax tree representations of variational problems, which are used by other software libraries to generate concrete low-level implementations. Some application examples are presented and libraries that support UFL are highlighted. © 2014 ACM.
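The abstract-syntax-tree representation mentioned above can be sketched in a few lines of Python. This is an illustrative toy, not the real UFL API: the `Expr` class and the `grad`/`inner` helpers are hypothetical stand-ins, showing only how a weak form such as `inner(grad(u), grad(v))*dx` can be captured as a tree that a form compiler could later translate into low-level code.

```python
# Hypothetical mini form language (not the real UFL API): expressions are
# nodes holding an operator name and child operands.

class Expr:
    def __init__(self, op, *operands):
        self.op, self.operands = op, operands

    def __repr__(self):
        if not self.operands:
            return self.op
        return f"{self.op}({', '.join(map(repr, self.operands))})"

def grad(e):
    return Expr("grad", e)

def inner(a, b):
    return Expr("inner", a, b)

u, v = Expr("u"), Expr("v")

# Weak form of the Laplace operator: a(u, v) = inner(grad(u), grad(v)) * dx
a = Expr("*", inner(grad(u), grad(v)), Expr("dx"))
print(a)  # prints: *(inner(grad(u), grad(v)), dx)
```

A code generator would then walk this tree, which is the division of labor the abstract describes: UFL builds the representation, other libraries emit the concrete implementation.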

Relevance: 10.00%

Abstract:

We propose a new formally syntax-based method for statistical machine translation. Transductions between parsing trees are transformed into a problem of sequence tagging, which is then tackled by a search-based structured prediction method. This allows us to automatically acquire translation knowledge from a parallel corpus without the need for complex linguistic parsing. The method achieves results comparable to a phrase-based method (like Pharaoh), while using only about ten percent of the translation table entries. Experiments show that the structured prediction approach to SMT is promising for its strong ability at combining words.

Relevance: 10.00%

Abstract:

We propose an integrated syntactic-semantic method for language analysis, in which syntactic parsing and semantic interpretation proceed in parallel and exploit the interplay between the two. Given the constrained nature of natural language understanding in the specific domain of geometry, the method builds on dependency grammar, draws on knowledge from an annotated corpus, and applies a combined rule-based and statistical model to perform integrated syntactic-semantic analysis of sentences whose words have already been tagged with part of speech and semantics, generating mathematical expressions that conform to mathematical conventions. In tests of the system on 100 sentences describing geometry problems, the accuracy reached 98%, showing good practical utility in the geometry domain and meeting real-world needs.

Relevance: 10.00%

Abstract:

On low-end microcomputers with slow, serial processing hardware, the heuristic concepts proposed in this paper, together with layered search and matching strategies and a cap on the maximum search length, can speed up inference by more than an order of magnitude. In addition, by introducing semantic information, resolving ambiguity in stages, combining top-down with bottom-up processing, and uniformly converting general interrogative sentences into the corresponding declarative sentences, we solve a series of difficult problems in automatic English syntactic parsing and reduce the size of the knowledge base.

Relevance: 10.00%

Abstract:

This paper briefly describes the implementation of a natural language interface for an expert system for automatic NC (numerical control) programming. The interface was built against the automatic NC programming expert system we developed; it runs under UNIX on SUN3/4 workstations and under DOS on IBM/AT machines, and is written in C. It consists of five modules, namely lexical analysis, syntactic analysis, semantic and pragmatic analysis, target code generation, and graphical simulation, together with the corresponding knowledge bases. The interface accepts English natural language descriptions of workpieces, as required by the NC programming system, and can handle some relatively simple English language phenomena.

Relevance: 10.00%

Abstract:

We propose a system modeling method that combines IDEF and UML, which effectively compensates for IDEF's weakness in modeling information and process flows and for UML's shortcomings in semantic precision and flexibility of use. The method uses IDEF0 for system function modeling, and IDEF1x together with UML models for information modeling and object-oriented software system design. A component-based production process management system model designed with this method for a metallurgical equipment company in Shenyang exhibits good maintainability, extensibility, and reusability, demonstrating the feasibility of the method.

Relevance: 10.00%

Abstract:

In this article, we begin from research on how money influences people's behavior, and examine how money influences moral judgment across the five moral foundations (harm/care, fairness/reciprocity, in-group/loyalty, authority/respect, purity/sanctity). We also ask whether money priming operates through the mechanism of competition priming, and whether moral identification and positive vocabulary can reverse the change in moral judgment that follows money priming. Money priming in this research is based on picture priming and syntax priming; competition priming and moral identification priming are based on imagination priming. We sampled undergraduates, graduate students, and adults, combining scale-based surveys with computer-based experiments across five standard experiments. From this research we draw the following conclusions: 1. Money priming affects moral judgment, though not consistently across the moral foundations; its overall effect is to shift moral judgment toward a worse state. 2. Money priming is not simply competition priming, but social value orientation (including competition orientation) mediates the influence of money priming on moral judgment; in general, people with an individualistic or competitive orientation are more easily influenced by money priming. 3. After money priming, moral judgment can be influenced by moral identification and positive vocabulary, and both can shift it toward a better state. Their functions may differ, however: moral identification is more related to moral cognition, while positive vocabulary is more related to moral emotional regulation.
Based on the priming method, this research provides experimental evidence for the influence of money on psychology, on the concept of morality, and on moral identification. It also discusses the measurement of morality.

Relevance: 10.00%

Abstract:

The research investigates the acoustic-phonetic correlates of various levels of syntactic boundaries and the perception of prosody in Mandarin Chinese, more specifically, the way speakers express the syntactic relations between sentence components and the perceptual representations of prosody. The relation between phonology and syntax in Chinese is studied by comparing the perceptual representations and syntactic structures of sentences. The results may have theoretical and practical implications for research in speech perception, linguistics and psycholinguistics, and for the development of speech engineering in China.

Relevance: 10.00%

Abstract:

This paper introduces Denotational Proof Languages (DPLs). DPLs are languages for presenting, discovering, and checking formal proofs. In particular, in this paper we discuss type-alpha DPLs---a simple class of DPLs for which termination is guaranteed and proof checking can be performed in time linear in the size of the proof. Type-alpha DPLs allow for lucid proof presentation and for efficient proof checking, but not for proof search. Type-omega DPLs allow for search as well as simple presentation and checking, but termination is no longer guaranteed and proof checking may diverge. We do not study type-omega DPLs here. We start by listing some common characteristics of DPLs. We then illustrate with a particularly simple example: a toy type-alpha DPL called PAR, for deducing parities. We present the abstract syntax of PAR, followed by two different kinds of formal semantics: evaluation and denotational. We then relate the two semantics and show how proof checking becomes tantamount to evaluation. We proceed to develop the proof theory of PAR, formulating and studying certain key notions such as observational equivalence that pervade all DPLs. We then present NDL, a type-alpha DPL for classical zero-order natural deduction. Our presentation of NDL mirrors that of PAR, showing how every basic concept that was introduced in PAR resurfaces in NDL. We present sample proofs of several well-known tautologies of propositional logic that demonstrate our thesis that DPL proofs are readable, writable, and concise. Next we contrast DPLs to typed logics based on the Curry-Howard isomorphism, and discuss the distinction between pure and augmented DPLs. Finally we consider the issue of implementing DPLs, presenting an implementation of PAR in SML and one in Athena, and end with some concluding remarks.
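The idea that proof checking is tantamount to evaluation can be sketched concretely. The toy below is hypothetical Python, not the paper's PAR syntax: proofs are nested tuples, and checking a proof simply means evaluating it against an assumption base of known parities, with evaluation failing when a rule is misapplied.

```python
# Toy "deducing parities" checker (an illustrative analogy to PAR, not its
# actual syntax or semantics). A proof is either a citation of an assumed
# fact or an application of the parity-of-a-sum rule to two subproofs.

EVEN, ODD = "even", "odd"

def check(proof, assumptions):
    """Evaluate a proof; return the (number, parity) judgment it establishes."""
    tag = proof[0]
    if tag == "claim":                    # cite a fact in the assumption base
        fact = proof[1]
        assert fact in assumptions, f"not assumed: {fact}"
        return fact
    if tag == "sum":                      # even+even=even, even+odd=odd, ...
        n1, p1 = check(proof[1], assumptions)
        n2, p2 = check(proof[2], assumptions)
        return (n1 + n2, EVEN if p1 == p2 else ODD)
    raise ValueError(f"unknown proof form: {tag}")

base = {(2, EVEN), (3, ODD)}
proof = ("sum", ("claim", (2, EVEN)), ("claim", (3, ODD)))
print(check(proof, base))  # prints: (5, 'odd')
```

Because `check` terminates in one pass over the proof tree, checking is linear in the size of the proof, mirroring the guarantee the abstract states for type-alpha DPLs.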

Relevance: 10.00%

Abstract:

Type-omega DPLs (Denotational Proof Languages) are languages for proof presentation and search that offer strong soundness guarantees. LCF-type systems such as HOL offer similar guarantees, but their soundness relies heavily on static type systems. By contrast, DPLs ensure soundness dynamically, through their evaluation semantics; no type system is necessary. This is possible owing to a novel two-tier syntax that separates deductions from computations, and to the abstraction of assumption bases, which is factored into the semantics of the language and allows for sound evaluation. Every type-omega DPL properly contains a type-alpha DPL, which can be used to present proofs in a lucid and detailed form, exclusively in terms of primitive inference rules. Derived inference rules are expressed as user-defined methods, which are "proof recipes" that take arguments and dynamically perform appropriate deductions. Methods arise naturally via parametric abstraction over type-alpha proofs. In that light, the evaluation of a method call can be viewed as a computation that carries out a type-alpha deduction. The type-alpha proof "unwound" by such a method call is called the "certificate" of the call. Certificates can be checked by exceptionally simple type-alpha interpreters, and thus they are useful whenever we wish to minimize our trusted base. Methods are statically closed over lexical environments, but dynamically scoped over assumption bases. They can take other methods as arguments, they can iterate, and they can branch conditionally. These capabilities, in tandem with the bifurcated syntax of type-omega DPLs and their dynamic assumption-base semantics, allow the user to define methods in a style that is disciplined enough to ensure soundness yet fluid enough to permit succinct and perspicuous expression of arbitrarily sophisticated derived inference rules. 
We demonstrate every major feature of type-omega DPLs by defining and studying NDL-omega, a higher-order, lexically scoped, call-by-value type-omega DPL for classical zero-order natural deduction---a simple choice that allows us to focus on type-omega syntax and semantics rather than on the subtleties of the underlying logic. We start by illustrating how type-alpha DPLs naturally lead to type-omega DPLs by way of abstraction; present the formal syntax and semantics of NDL-omega; prove several results about it, including soundness; give numerous examples of methods; point out connections to the lambda-phi calculus, a very general framework for type-omega DPLs; introduce a notion of computational and deductive cost; define several instrumented interpreters for computing such costs and for generating certificates; explore the use of type-omega DPLs as general programming languages; show that DPLs do not have to be type-less by formulating a static Hindley-Milner polymorphic type system for NDL-omega; discuss some idiosyncrasies of type-omega DPLs such as the potential divergence of proof checking; and compare type-omega DPLs to other approaches to proof presentation and discovery. Finally, a complete implementation of NDL-omega in SML-NJ is given for users who want to run the examples and experiment with the language.
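The notion of a method that "unwinds" a type-alpha certificate can be illustrated with a small analogy. The sketch below is hypothetical Python, not DPL syntax: a derived rule is a function that calls primitive rules, each primitive call records the step it performs, and the recorded step list plays the role of the certificate of the method call.

```python
# Illustrative analogy only: a "method" as parametric abstraction over
# primitive inference steps, with the unwound steps recorded as a certificate.

certificate = []

def modus_ponens(imp, p):
    """Primitive rule: from ('->', P, Q) and P, conclude Q."""
    assert imp[0] == "->" and imp[1] == p, "modus ponens does not apply"
    certificate.append(("mp", imp, p))    # record the primitive step
    return imp[2]

def chain(ab, bc, a):
    """User-defined method: from A->B, B->C, and A, derive C."""
    return modus_ponens(bc, modus_ponens(ab, a))

conclusion = chain(("->", "A", "B"), ("->", "B", "C"), "A")
print(conclusion, len(certificate))  # prints: C 2
```

The call to `chain` is a computation that carries out a two-step deduction; `certificate` holds the type-alpha-style trace that a minimal checker could verify independently, which is the trusted-base point made above.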

Relevance: 10.00%

Abstract:

In a Communication Bootstrapping system, peer components with different perceptual worlds invent symbols and syntax based on correlations between their percepts. I propose that Communication Bootstrapping can also be used to acquire functional definitions of words and causal reasoning knowledge. I illustrate this point with several examples, then sketch the architecture of a system in progress which attempts to execute this task.

Relevance: 10.00%

Abstract:

How can one represent the meaning of English sentences in a formal logical notation such that the translation of English into this logical form is simple and general? This report answers this question for a particular kind of meaning, namely quantifier scope, and for a particular part of the translation, namely the syntactic influence on the translation. Rules are presented which predict, for example, that the sentence "Everyone in this room speaks at least two languages." has the quantifier scope AE in standard predicate calculus, while the sentence "At least two languages are spoken by everyone in this room." has the quantifier scope EA. Three different logical forms are presented, and their translation rules are examined. One of the logical forms is predicate calculus. The translation rules for it were developed by Robert May (May 1977). The other two logical forms are Skolem form and a simple computer programming language. The translation rules for these two logical forms are new. All three sets of translation rules are shown to be general, in the sense that the same rules express the constraints that syntax imposes on certain other linguistic phenomena. For example, the rules that constrain the translation into Skolem form are shown to constrain definite NP anaphora as well. A large body of carefully collected data is presented and used to assess the empirical accuracy of each of the theories. None of the three theories is vastly superior to the others. However, the report concludes by suggesting that a combination of the two newer theories would have the greatest generality and the highest empirical accuracy.
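The AE/EA contrast can be made concrete with a toy model check. The sketch below is illustrative Python and not part of the report (which is about syntactic translation rules, not model checking); it shows that the two scope orders have genuinely different truth conditions in a small model where everyone speaks two languages but no two languages are shared by all.

```python
# Toy model: who speaks what.
people = {"ann", "bob"}
languages = {"en", "fr", "de"}
speaks = {("ann", "en"), ("ann", "fr"), ("bob", "fr"), ("bob", "de")}

# AE reading: for every person there exist at least two languages they speak.
ae = all(sum((p, l) in speaks for l in languages) >= 2 for p in people)

# EA reading: there exist two particular languages spoken by every person.
ea = any(
    all((p, l1) in speaks and (p, l2) in speaks for p in people)
    for l1 in languages for l2 in languages if l1 != l2
)

print(ae, ea)  # prints: True False
```

Here the AE reading is true (ann speaks en/fr, bob speaks fr/de) while the EA reading is false (no pair of languages is spoken by both), which is exactly why the two example sentences above must be assigned different scopes.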

Relevance: 10.00%

Abstract:

"The Structure and Interpretation of Computer Programs" is the entry-level subject in Computer Science at the Massachusetts Institute of Technology. It is required of all students at MIT who major in Electrical Engineering or in Computer Science, as one fourth of the "common core curriculum," which also includes two subjects on circuits and linear systems and a subject on the design of digital systems. We have been involved in the development of this subject since 1978, and we have taught this material in its present form since the fall of 1980 to approximately 600 students each year. Most of these students have had little or no prior formal training in computation, although most have played with computers a bit and a few have had extensive programming or hardware design experience. Our design of this introductory Computer Science subject reflects two major concerns. First, we want to establish the idea that a computer language is not just a way of getting a computer to perform operations, but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute. Second, we believe that the essential material to be addressed by a subject at this level is not the syntax of particular programming language constructs, nor clever algorithms for computing particular functions efficiently, nor even the mathematical analysis of algorithms and the foundations of computing, but rather the techniques used to control the intellectual complexity of large software systems.

Relevance: 10.00%

Abstract:

This paper defines a structured methodology which is based on the foundational work of Al-Shaer et al. in [1] and that of Hamed and Al-Shaer in [2]. It covers the declaration of policy field elements through to the syntax, ontology and functional verification stages. In their works of [1] and [2] the authors concentrated on developing formal definitions of possible anomalies between rules in a network firewall rule set. Their work is considered the foundation for further work on anomaly detection, including that of Fitzgerald et al. [3], Chen et al. [4], and Hu et al. [5], among others. This paper extends this work by applying the methods to information-sharing policies, and outlines the related evaluation.
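The kind of inter-rule anomaly formalized in [1] and [2] can be illustrated with a minimal sketch. The rule format below is hypothetical and not the authors' notation; it shows the shadowing anomaly, where a later rule can never fire because an earlier rule with a different action matches a superset of its packets.

```python
# Illustrative shadowing check (hypothetical rule format, not the formal
# definitions of [1]/[2]). Match fields are (lo, hi) ranges; first rule wins.

def field_subset(f1, f2):
    """Is range f1 entirely contained in range f2?"""
    return f2[0] <= f1[0] and f1[1] <= f2[1]

def shadowed(r_later, r_earlier):
    """r_later is shadowed if r_earlier matches a superset of its packets
    but applies a different action, so r_later can never take effect."""
    contained = all(
        field_subset(a, b)
        for a, b in zip(r_later["match"], r_earlier["match"])
    )
    return contained and r_later["action"] != r_earlier["action"]

# match = (source-port range, destination-port range)
r1 = {"match": ((0, 65535), (80, 80)), "action": "deny"}
r2 = {"match": ((1024, 65535), (80, 80)), "action": "accept"}

print(shadowed(r2, r1))  # prints: True -- r2 can never fire
```

Extending such checks from firewall rule sets to information-sharing policies is, at this level of abstraction, a matter of swapping the field ranges for the policy field elements declared by the methodology.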