2 results for lexical unit

in ArchiMeD - Elektronische Publikationen der Universität Mainz - Germany


Relevance:

20.00%

Publisher:

Abstract:

In this study, structural, metamorphic, and geochronological data are used to quantify the tectonic processes responsible for the exhumation of the Cycladic Blueschist Unit in the Aegean and western Turkey. The two tectonic processes are: (1) normal faulting and (2) vertical ductile thinning. A finite-strain analysis of samples from the Cycladic Blueschist Unit allows the contribution of vertical ductile thinning to the total exhumation to be estimated. Calculations with a one-dimensional numerical model show that vertical ductile thinning accounts for only about 10% of the total exhumation. Kinematic, metamorphic, and geochronological data explain the tectonic nature and evolution of an extensional fault system on the island of Ikaria in the eastern Aegean. Thermobarometric data indicate that the footwall of the fault system was exhumed from a depth of about 15 km. Apatite and zircon fission-track ages, as well as apatite (U-Th)/He ages, show that the extensional fault system was active between 11 and 3 Ma, slipping at a rate of about 7-8 km/Ma. Late Miocene normal faults contributed to the exhumation of the final ~5-15 km of the high-pressure rocks. Most of the exhumation of the Cycladic Blueschist Unit must therefore have occurred before the Miocene. This is explained by an extrusion wedge that exhumed about 30-35 km of the Cycladic Blueschist Unit in western Turkey. 40Ar/39Ar and 87Rb/86Sr dating of mylonites from the upper normal-fault contact between the Selçuk nappe and the underlying Ampelos/Dilek nappe of the Cycladic Blueschist Unit, as well as from the lower thrust contact between the Ampelos/Dilek nappe and the underlying Menderes nappes, shows that both mylonitic zones formed at about 35 Ma, demonstrating the existence of a Late Eocene/Early Oligocene extrusion wedge.
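As a rough cross-check of the figures quoted above, the following Python sketch relates them geometrically. It is not part of the thesis: the slip rate, activity window, footwall depth, and ~10% ductile-thinning share come from the abstract, while the fault dip of ~15° is an assumption chosen here purely for illustration.

```python
import math

# Figures from the abstract.
slip_rate_km_per_ma = 7.5                    # ca. 7-8 km/Ma
active_from_ma, active_to_ma = 11.0, 3.0     # fault active 11-3 Ma
duration_ma = active_from_ma - active_to_ma  # 8 Myr of activity

total_slip_km = slip_rate_km_per_ma * duration_ma  # ~60 km along the fault

# Vertical exhumation = slip * sin(dip). The ~15 degree dip is an
# assumed low-angle detachment geometry, not a value from the thesis.
assumed_dip_deg = 15.0
vertical_km = total_slip_km * math.sin(math.radians(assumed_dip_deg))

print(f"total slip along fault: {total_slip_km:.0f} km")
print(f"vertical component:     {vertical_km:.1f} km")
# ~15.5 km, consistent with the ~15 km footwall depth from thermobarometry.

# Per the 1-D model, vertical ductile thinning supplies only ~10% of the
# total exhumation; normal faulting must account for most of the rest.
print(f"ductile thinning share: {0.10:.0%}")
```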

Relevance:

20.00%

Publisher:

Abstract:

This thesis concerns artificially intelligent natural language processing systems that are capable of learning the properties of lexical items (properties like verbal valency or inflectional class membership) autonomously while carrying out the tasks for which they were deployed in the first place. Many of these tasks require a deep analysis of language input, which can be characterized as a mapping of utterances in a given input C to a set S of linguistically motivated structures, with the help of linguistic information encoded in a grammar G and a lexicon L:

G + L + C → S    (1)

The idea underlying intelligent lexical acquisition systems is to modify this schematic formula in such a way that the system is able to exploit the information encoded in S to create a new, improved version of the lexicon:

G + L + S → L'    (2)

Moreover, the thesis claims that a system can only be considered intelligent if it not only makes maximal use of the learning opportunities in C, but is also able to revise falsely acquired lexical knowledge. One of the central elements of this work is therefore the formulation of a set of criteria for intelligent lexical acquisition systems, subsumed under one paradigm: the Learn-Alpha design rule.

The thesis describes the design and quality of a prototype of such a system, whose acquisition components have been developed from scratch and built on top of one of the state-of-the-art Head-driven Phrase Structure Grammar (HPSG) processing systems. The quality of this prototype is investigated in a series of experiments in which the system is fed with extracts of a large English corpus. While the idea of using machine-readable language input to automatically acquire lexical knowledge is not new, we are not aware of a system that fulfills Learn-Alpha and is able to deal with large corpora. To name four major challenges of constructing such a system:

a) the high number of possible structural descriptions caused by highly underspecified lexical entries demands a parser with a very effective ambiguity management system;
b) the automatic construction of concise lexical entries out of a bulk of observed lexical facts requires a special technique of data alignment;
c) the reliability of these entries depends on the system's decision as to whether it has seen 'enough' input; and
d) general properties of language might render some lexical features indeterminable if the system tries to acquire them at too high a precision.

The cornerstone of this dissertation is the motivation and development of a general theory of automatic lexical acquisition that is applicable to every language and independent of any particular theory of grammar or lexicon.

The work is divided into five chapters. The introductory chapter first contrasts three different and mutually incompatible approaches to (artificial) lexical acquisition: cue-based queries, head-lexicalized probabilistic context-free grammars, and learning by unification; it then presents the postulation of the Learn-Alpha design rule. The second chapter outlines the theory that underlies Learn-Alpha and introduces the notions and concepts required for a proper understanding of artificial lexical acquisition. Chapter 3 develops the prototyped acquisition method, called ANALYZE-LEARN-REDUCE, a framework which implements Learn-Alpha.
The fourth chapter presents the design and results of a bootstrapping experiment conducted with this prototype: lexeme detection, learning of verbal valency, categorization into nominal count/mass classes, and selection of prepositions and sentential complements, among others. The thesis concludes with a summary of its findings, motivation for further improvements, and proposals for future research on the automatic induction of lexical features.
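To make schema (2) concrete, here is a toy, fully runnable Python sketch of an ANALYZE-LEARN-REDUCE-style loop: parse a corpus with the current grammar and lexicon as in (1), collect the lexical facts encoded in the resulting structures S, and reduce them into a revised lexicon L' as in (2). All names, the pre-annotated toy data, and the evidence threshold are illustrative assumptions made here; the actual framework operates on full HPSG analyses, not on this simplified representation.

```python
from collections import Counter, defaultdict

# Toy stand-in for deep analysis (G + L + C -> S): here the "structures"
# are simply (verb, valency frame) facts shipped with each utterance.
def parse(grammar, lexicon, utterance):
    return utterance["analyses"]

def analyze_learn_reduce(grammar, lexicon, corpus, min_evidence=3):
    observed = defaultdict(Counter)          # verb -> Counter of frames

    # ANALYZE: collect lexical facts from the analyses of the corpus.
    for utterance in corpus:
        for verb, frame in parse(grammar, lexicon, utterance):
            observed[verb][frame] += 1

    # LEARN + REDUCE (G + L + S -> L'): generalize the bulk of observed
    # facts into concise entries, but only once the system has seen
    # 'enough' input; overwriting an existing entry models the revision
    # of falsely acquired lexical knowledge.
    new_lexicon = dict(lexicon)
    for verb, frames in observed.items():
        if sum(frames.values()) >= min_evidence:
            new_lexicon[verb] = frames.most_common(1)[0][0]
    return new_lexicon

corpus = [
    {"analyses": [("devour", "transitive")]},
    {"analyses": [("devour", "transitive"), ("sleep", "intransitive")]},
    {"analyses": [("devour", "transitive")]},
]
print(analyze_learn_reduce(grammar=None, lexicon={}, corpus=corpus))
# {'devour': 'transitive'} -- 'sleep' stays unlearned: too little evidence.
```

The evidence threshold is a crude proxy for challenge c) above (deciding whether the system has seen 'enough' input), and the Counter-based reduction is a stand-in for the data-alignment technique of challenge b).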