803 results for Codes and repertoires of language
Abstract:
In “The English Patient: English Grammar and teaching in the Twentieth Century”, Hudson and Walmsley (2005) contend that the decline of grammar in schools was linked to a similar decline in English universities, where no serious research or teaching on English grammar took place. This article argues that the decline was due not only to a lack of research but also to the fact that it suited the educational policies of the time. It applies Bernstein’s theory of pedagogic discourse (1990, 1996) to the case study of the debate surrounding the introduction of a national curriculum in English in England in the late 1980s and the National Literacy Strategy in the 1990s, to demonstrate the links between academic theory and educational policy.
Abstract:
Corpus Linguistics is a young discipline. The earliest work was done in the 1960s, but corpora only began to be widely used by lexicographers and linguists in the late 1980s, by language teachers in the late 1990s, and by language students only very recently. This course in corpus linguistics was held at the Departamento de Linguistica Aplicada, E.T.S.I. de Minas, Universidad Politecnica de Madrid, from 15 to 19 June 1998. About 45 teachers registered for the course: 30% had PhDs in linguistics, 20% in literature, and the rest were doctorandi or qualified English teachers. The course was designed to introduce the use of corpora and other computational resources in teaching and research, with special reference to scientific and technological discourse in English. Each participant had a computer networked with the lecturer’s machine, whose display could be projected onto a large screen. Application programs were loaded onto the central server, and telnet and a web browser were available. COBUILD gave us permission to access the 323 million word Bank of English corpus, Mike Scott allowed us to use his WordSmith Tools software, and Tim Johns gave us a copy of his MicroConcord program.
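The concordancers named above (WordSmith Tools, MicroConcord) are built around KWIC (key word in context) displays. A minimal sketch of that core operation, using an invented mini-corpus and search term rather than any of the course's actual materials, might look like:

```python
# KWIC concordance sketch: find every occurrence of a keyword and show
# its left and right context, as concordancers like MicroConcord do.

def concordance(tokens, keyword, context=3):
    """Return (left context, keyword, right context) tuples for each hit."""
    hits = []
    for i, tok in enumerate(tokens):
        if tok.lower() == keyword.lower():
            left = " ".join(tokens[max(0, i - context):i])
            right = " ".join(tokens[i + 1:i + 1 + context])
            hits.append((left, tok, right))
    return hits

# Invented mini-corpus for illustration.
corpus = "The gas flows through the pipe and the gas cools rapidly".split()
for left, kw, right in concordance(corpus, "gas"):
    print(f"{left:>25} | {kw} | {right}")
```

Real concordancers add sorting by left or right context, wildcard queries and frequency statistics on top of this basic keyword-alignment step.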
Abstract:
In the introduction to the special issue “Languaging the worker: globalized governmentalities in/of language in peripheral spaces”, we take up the notion of governmentality as a means to interrogate the complex relationship between language, labor, power and subjectivity in peripheral multilingual spaces. Our aim here is to argue for the study of governmentality as a viable and growing approach in critical sociolinguistic research. As such, in this introduction, we first discuss key concepts germane to our interrogations, including the notions of governmentality, languaging, peripherality and language worker. We proceed to map out five ethnographically and discourse-analytically informed case studies. These examine diverse actors in different settings pertaining to the domain of work. Finally we chart how the case studies construe the issue of languaging the worker through a governmentality frame.
Abstract:
This paper introduces a method for the analysis of regional linguistic variation. The method identifies individual and common patterns of spatial clustering in a set of linguistic variables measured over a set of locations based on a combination of three statistical techniques: spatial autocorrelation, factor analysis, and cluster analysis. To demonstrate how to apply this method, it is used to analyze regional variation in the values of 40 continuously measured, high-frequency lexical alternation variables in a 26-million-word corpus of letters to the editor representing 206 cities from across the United States.
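The first of the three techniques named, spatial autocorrelation, is commonly measured with global Moran's I. A minimal sketch of that statistic, computed over invented values for four hypothetical cities on a line (not the study's actual 206-city data), might look like:

```python
# Global Moran's I sketch: positive values indicate that similar values
# cluster in space, negative values that neighbours tend to differ.

def morans_i(values, weights):
    """Moran's I for `values`, given a symmetric spatial weight matrix."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(weights[i][j] for i in range(n) for j in range(n))
    return (n / w_sum) * (num / den)

# Four cities on a line; each city neighbours the adjacent ones.
w = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
clustered = [0.9, 0.8, 0.2, 0.1]   # similar values sit next to each other
random_ish = [0.9, 0.1, 0.8, 0.2]  # values alternate
print(morans_i(clustered, w), morans_i(random_ish, w))
```

The paper's method goes further, feeding such autocorrelation patterns for many variables into factor analysis and then clustering the factor scores.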
Abstract:
This thesis describes the design and engineering of a pressurised biomass gasification test facility. A detailed examination of the major elements within the plant has been undertaken in relation to specification of equipment, evaluation of options and final construction. The retrospective project assessment was developed from consideration of relevant literature and theoretical principles. The literature review includes a discussion on legislation and applicable design codes. From this analysis, each of the necessary equipment units was reviewed and important design decisions and procedures highlighted and explored. Particular emphasis was placed on examination of the stringent demands of the ASME VIII design codes. The inter-relationship of functional units was investigated and areas of deficiency, such as biomass feeders and gas cleaning, have been commented upon. Finally, plant costing was summarised in relation to the plant design and proposed experimental programme. The main conclusion drawn from the study is that pressurised gasification of biomass is far more difficult and expensive to support than atmospheric gasification. A number of recommendations have been made regarding future work in this area.
Abstract:
This work explores the relevance of semantic and linguistic description to translation, theory and practice. It is aimed towards a practical model of approach to texts to translate. As literary texts [poetry mainly] are the focus of attention, so are stylistic matters. Note, however, that 'style', and, to some extent, the conclusions of the work, are not limited to so-called literary texts. The study of semantic description reveals that most translation problems do not stem from the cognitive (langue-related), but rather from the contextual (parole-related) aspects of meaning. Thus, any linguistic model that fails to account for the latter is bound to fall short. T.G.G. does, whereas Systemics, concerned with both the 'langue' and 'parole' (stylistic and sociolinguistic mainly) aspects of meaning, provides a useful framework of approach to texts to translate. Two essential semantic principles for translation are: that meaning is the property of a language (Firth); and the 'relativity of meaning assignments' (Tymoczko). Both imply that meaning can only be assessed, correctly, in the relevant socio-cultural background. Translation is seen as a restricted creation, and the translator's approach as a three-dimensional critical one. To encompass the most technical to the most literary text, and account for variations in emphasis in any text, translation theory must be based on a typology of functions: Halliday's ideational, interpersonal and textual functions, or Buhler's symbol, signal and symptom functions. Function [overall and specific] will dictate aims and method, and also provide the critic with criteria to assess translation faithfulness. Translation can never be reduced to purely objective methods, however. Intuitive procedures intervene, in textual interpretation and analysis, in the choice of equivalents, and in the reception of a translation.
Ultimately, translation, in theory and practice, may constitute the touchstone as regards the validity of linguistic and semantic theories.
Abstract:
This study presents a detailed contrastive description of the textual functioning of connectives in English and Arabic. Particular emphasis is placed on the organisational force of connectives and their role in sustaining cohesion. The description is intended as a contribution to a better understanding of the variations in the dominant tendencies for text organisation in each language. The findings are expected to be utilised for pedagogical purposes, particularly in improving EFL teaching of writing at the undergraduate level. The study is based on an empirical investigation of the phenomenon of connectivity and, for optimal efficiency, employs computer-aided procedures, particularly those adopted in corpus linguistics, for investigatory purposes. One important methodological requirement is the establishment of two comparable and statistically adequate corpora, as well as the design of software and the use of existing packages to achieve the basic analysis. Each corpus comprises ca 250,000 words of newspaper material sampled in accordance with a specific set of criteria and assembled in machine-readable form prior to the computer-assisted analysis. A suite of programmes has been written in SPITBOL to accomplish a variety of analytical tasks, and in particular to perform a battery of measurements intended to quantify the textual functioning of connectives in each corpus. Concordances and some word lists are produced by using OCP. Results of this research confirm the existence of fundamental differences in text organisation in Arabic in comparison to English. This manifests itself in the way textual operations of grouping and sequencing are performed and in the intensity of the textual role of connectives in imposing linearity and continuity and in maintaining overall stability.
Furthermore, computation of connective functionality and range of operationality has identified fundamental differences in the way favourable choices for text organisation are made and implemented.
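One of the battery of measurements described, quantifying how often connectives occur in a corpus, can be sketched as follows; the connective inventory and the one-line sample are invented stand-ins for the thesis's full inventories and 250,000-word newspaper corpora:

```python
# Connective-density sketch: count connective tokens per 1,000 words,
# the kind of measurement the SPITBOL programmes performed at scale.

from collections import Counter

# Invented, deliberately tiny connective list for illustration.
CONNECTIVES = {"and", "but", "so", "however", "therefore"}

def connective_density(text):
    """Return (connectives per 1,000 words, per-connective counts)."""
    tokens = text.lower().split()
    counts = Counter(t for t in tokens if t in CONNECTIVES)
    return 1000 * sum(counts.values()) / len(tokens), counts

sample = "prices rose but demand held so exports grew and imports fell"
density, counts = connective_density(sample)
print(round(density, 1), dict(counts))
```

Running the same measurement over each corpus and comparing the densities and the distribution across individual connectives is what allows the cross-language contrast to be quantified.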
Abstract:
The thesis is concerned with cross-cultural distance learning in two countries: Great Britain and France. Taking the example of in-house sales training, it argues that it is possible to develop courses for use in two or more countries of differing culture and language. Two courses were developed by the researcher. Both were essentially print-based distance-learning courses designed to help salespeople achieve a better understanding of their customers. One used a quantitative approach, the other a qualitative one. One considered the concept of the return on investment and the other, for which a video support was also developed, considered the analysis of a customer's needs. Part 1 of the thesis considers differences in the training context between France and Britain, followed by a review of the learning process with reference to distance learning. Part 2 looks at the choice of training medium, course design and evaluation, and sets out the methodology adopted, including problems encountered in this type of fieldwork. Part 3 analyses the data and draws conclusions from the findings, before offering a series of guidelines for those concerned with the development of cross-cultural in-house training courses. The results of the field tests on the two courses were analysed in relation to the socio-cultural, educational and experiential background of the learners as well as their preferred learning styles. The thesis argues that it is possible to develop effective in-house sales training courses to be used in two cultures and identifies key considerations which need to be taken into account when carrying out this type of work.
Abstract:
The aim of this thesis is to explore key aspects and problems of the institutionalised teaching and learning of German language and culture in the context of German Studies in British Higher Education (HE). This investigation focuses on teaching and learning experiences in one department of German Studies in the UK, which is the micro-context of the present study, in order to provide an in-depth insight into real-life problems, strengths and weaknesses as they occur in the practice of teaching and learning German. Following Lamb (2004) and Holliday (1994), the present study acts on the assumption that each micro-context does not exist in vacuo but is always embedded in a wider socio-political and educational environment, namely the macro-context, which largely determines how and what is taught. The macro-analysis of the present study surveys the socio-political developments that have recently affected the sector of modern languages and specifically the discipline of German Studies in the UK. It demonstrates the impact they have had on teaching and learning German at the undergraduate level in Britain. This context is interesting inasmuch as the situation in Britain is to a large extent a paradigmatic example of the developments in German Studies in English-speaking countries. Subsequently, the present study explores the learning experiences of a group of thirty-five first-year students. It focuses on their previous experiences in learning German, exposure to the target language, motivation, learning strategies and difficulties encountered when learning German at the tertiary level. Then, on the basis of interviews with five lecturers of German, teaching experience in the context under study is explored, and problems and successful teaching strategies are discussed.
Abstract:
This thesis attempts a psychological investigation of hemispheric functioning in developmental dyslexia. Previous work using neuropsychological methods with developmental dyslexics is reviewed, and original work is presented both of a conventional psychometric nature and also utilising a new means of intervention. At the inception of inquiry into dyslexia, comparisons were drawn between developmental dyslexia and acquired alexia, promoting a model of brain damage as the common cause. Subsequent investigators found developmental dyslexics to be neurologically intact, and so an alternative hypothesis was offered, namely that language is abnormally localized (not in the left hemisphere). Research in the last decade, using the advanced techniques of modern neuropsychology, has indicated that developmental dyslexics are probably left hemisphere dominant for language. The development of a new type of pharmaceutical preparation (that appears to have a left hemisphere effect) offers an opportunity to test the experimental hypothesis. This hypothesis propounds that most dyslexics are left hemisphere language dominant, but some of these language related operations are dysfunctioning. The methods utilised are those of psychological assessment of cognitive function, both in a traditional psychometric situation, and with a new form of intervention (Piracetam). The information resulting from intervention will be judged on its therapeutic validity and contribution to the understanding of hemispheric functioning in dyslexics. The experimental studies using conventional psychometric evaluation revealed a dyslexic profile of poor sequencing and name coding ability, with adequate spatial and verbal reasoning skills. Neuropsychological information would tend to suggest that this profile was indicative of adequate right hemisphere abilities and deficits in some left hemisphere abilities.
When an intervention agent (Piracetam) was used with young adult dyslexics there were improvements in both the rate of acquisition and conservation of verbal learning. An experimental study with dyslexic children revealed that Piracetam appeared to improve reading, writing and sequencing, but did not influence spatial abilities. This would seem to concord with other recent findings that developmental dyslexics may have left hemisphere language localisation, although some of these language related abilities are dysfunctioning.
A new role for Low German? Language insertion as bilingual practice in the process of language shift
Abstract:
This article analyses language insertion as a bilingual communicative practice, applying a functional, speaker-focused approach to the study of sociolinguistics and language contact. The article is based on a study of contact phenomena in a formerly diglossic region in Northern Germany, where the previously spoken language – Low German – is in the process of being replaced by the dominant standard variety, German. It examines regional publications in order to establish the linguistic techniques by which Low German elements are incorporated into the Standard German texts and the communicative purposes that they serve. The paper concludes that in the process of language shift an emblematic repertoire from Low German is created which can be inserted into the dominant contact language, German, for specific communicative purposes.
Abstract:
Communication and portability are the two main problems facing the user. An operating system, called PORTOS, was developed to solve these problems for users on dedicated microcomputer systems. Firstly, an interface language was defined, according to the anticipated requirements and behaviour of its potential users. Secondly, the PORTOS operating system was developed as a processor for this language. The system is currently running on two minicomputers of highly different architectures. PORTOS achieves its portability through its high-level design, and implementation in CORAL66. The interface language consists of a set of user commands and system responses. Although only a subset has been implemented, owing to time and manpower constraints, promising results were achieved regarding the usability of the language, and its portability.
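A command-language processor of the general kind described, parsing user commands and emitting system responses, can be sketched as follows; the command names and response strings here are invented for illustration and are not the actual PORTOS interface language:

```python
# Interface-language sketch: each command line is parsed into a verb
# and arguments, then dispatched to a handler that emits a response.

RESPONSES = []

def respond(message):
    """Record and display a system response."""
    RESPONSES.append(message)
    print(message)

def cmd_copy(args):
    respond(f"COPIED {args[0]} TO {args[1]}")

def cmd_show(args):
    respond(f"SHOWING {args[0]}")

COMMANDS = {"COPY": cmd_copy, "SHOW": cmd_show}

def execute(line):
    """Parse one command line and dispatch it to its handler."""
    verb, *args = line.split()
    handler = COMMANDS.get(verb.upper())
    if handler is None:
        respond(f"UNKNOWN COMMAND: {verb}")
    else:
        handler(args)

execute("copy report.txt backup.txt")
execute("archive old.txt")
```

Keeping the command table as data, as above, is one way such a system stays portable: the dispatch logic is machine-independent, and only the handlers touch the underlying hardware.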
Abstract:
Seminar discussion is an important mode of instruction in Higher Education. However, the discourse of discussion in academic seminars has been little investigated. Until now, there has existed only a limited amount of empirically based language description which could be used to inform those working in the field of English for Academic Purposes. The present study investigates discussion in seminars on an MBA programme and offers frameworks to account for central aspects of the verbal interaction: exchange patterns, acts and moves initiating exchanges, and strategies. Three subgenres of seminar discussion are examined: the discussion following a presentation by an outside speaker; the discussion following a presentation by students; and non-presentation tutorial discussion. Exchanges are found to be basically two-part structures of initiation and response. Some extended patterns are brought to light and it is argued that the major impetus prolonging exchanges in discussion is a third-part move registering dissatisfaction with the initial responses given. Exchanges are observed to be driven by moves functioning as elicitations, although acts at initiation both ask for information and ideas and propose them. Initiation may be complex and involve a mixture of the acts. Textual signalling and attitudinal strategies used in seminars are explicated. The latter are accounted for in terms of the face concerns of the speakers. The features are examined across the three subgenres. Some quantitative variations were observed. These variations are discussed in the light of situational variables such as levels of participant status and knowledge. Theoretical implications are drawn and applications for syllabus and methodology in English for Academic Purposes are suggested.
Abstract:
Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.
Abstract:
INTAMAP is a Web Processing Service for the automatic spatial interpolation of measured point data. Requirements were (i) using open standards for spatial data such as those developed in the context of the Open Geospatial Consortium (OGC), (ii) using a suitable environment for statistical modelling and computation, and (iii) producing an integrated, open source solution. The system couples an open-source Web Processing Service (developed by 52°North), accepting data in the form of standardised XML documents (conforming to the OGC Observations and Measurements standard), with a computing back-end realised in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a markup language designed to encode uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropy, extreme values, and data with known error distributions. Besides a fully automatic mode, the system can be used with different levels of user control over the interpolation process.
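The core task the service automates, estimating a value at an unsampled location from measured points, can be sketched with inverse-distance weighting; this simple scheme stands in for the geostatistical methods INTAMAP actually selects, and all coordinates and values below are invented:

```python
# Inverse-distance-weighted interpolation sketch: nearer measurements
# contribute more to the estimate at an unsampled location.

import math

def idw(points, x, y, power=2):
    """Estimate the value at (x, y) from (px, py, value) measurements."""
    num = den = 0.0
    for px, py, value in points:
        d = math.hypot(px - x, py - y)
        if d == 0:
            return value  # exact hit on a measurement location
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

obs = [(0, 0, 10.0), (1, 0, 20.0), (0, 1, 30.0)]
print(round(idw(obs, 0.5, 0.5), 2))
```

Unlike this sketch, a geostatistical back-end such as INTAMAP's also returns the probability distribution of the interpolation error, which is what UncertML then encodes.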