6 results for medium of instruction

in Greenwich Academic Literature Archive - UK


Relevance: 90.00%

Publisher:

Abstract:

Processing Instruction (PI) is an approach to grammar instruction for second language learning. It derives its name from the fact that the instruction (both the explicit explanation and the practices) attempts to influence, alter, and/or improve the way learners process input. PI contrasts with traditional grammar instruction in many ways, principally in its focus on input, whereas traditional grammar instruction focuses on learners' output. The greatest contribution of PI to both theory and practice is the concept of "structured input", a form of comprehensible input that has been manipulated to maximize the benefit learners gain from exposure to it. This volume focuses on a new issue for PI, the role of technology in language learning. It examines empirically the differential effects of delivering PI in classrooms, with an instructor and students interacting with each other and with the instructor, versus on computers to students working individually. It also contributes to the growing body of research on the effects of PI on different languages as well as different linguistic items: the preterite/imperfect aspectual contrast and negative informal commands in Spanish, the subjunctive of doubt and opinion in Italian, and the subjunctive of doubt in French. Further research contributions are made by comparing PI with other types of instruction, specifically with meaning-oriented output instruction.

Relevance: 80.00%

Publisher:

Abstract:

Kurzel (2004) points out that researchers in e-learning and educational technologists, in a quest to provide improved Learning Environments (LE) for students, are focusing on personalising the experience through a Learning Management System (LMS) that attempts to tailor the LE to the individual (see, amongst others, Eklund & Brusilovsky, 1998; Kurzel, Slay, & Hagenus, 2003; Martinez, 2000; Sampson, Karagiannidis, & Kinshuk, 2002; Voigt & Swatman, 2003). According to Kurzel (2004), this tailoring can have an impact on the content and how it is accessed, the media forms used, the method of instruction employed, and the learning styles supported. This project aims to move personalisation forward to the next generation by tackling the issue of personalised e-learning platforms as pre-requisites for building and generating individualised learning solutions. The proposed development is to create an e-learning platform with personalisation built in. This personalisation is to be set at different levels within the system, ranging from being guided by the information that the user inputs into the system down to the lower level of being set using information inferred by the system's processing engine. This paper will discuss some of our early work and ideas.
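
To make the layered personalisation idea concrete, here is a minimal, illustrative sketch rather than the project's actual design: settings supplied explicitly by the learner sit at the top level, and values inferred by a processing engine fill in whatever the learner leaves unset. All class, field, and method names (LearnerProfile, PersonalisationEngine, preferred_media, and so on) and the toy inference rule are invented for this example.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class LearnerProfile:
    """Explicit preferences the learner enters into the system (top level)."""
    preferred_media: Optional[str] = None      # e.g. "text", "video", "audio"
    instruction_method: Optional[str] = None   # e.g. "worked-examples"
    learning_style: Optional[str] = None       # left as None if not supplied

class PersonalisationEngine:
    """Combines learner-supplied settings with values inferred from usage data."""

    def __init__(self, profile: LearnerProfile) -> None:
        self.profile = profile
        self.inferred: Dict[str, str] = {}     # lower level, set by the engine

    def infer_from_usage(self, interaction_log: List[dict]) -> None:
        # Toy inference rule: if most logged interactions involve video content,
        # record a video preference at the inferred level.
        if not interaction_log:
            return
        video_hits = sum(1 for event in interaction_log if event.get("media") == "video")
        if video_hits / len(interaction_log) > 0.5:
            self.inferred["preferred_media"] = "video"

    def setting(self, name: str) -> Optional[str]:
        # Explicit user input takes precedence; inferred values fill the gaps.
        explicit = getattr(self.profile, name, None)
        return explicit if explicit is not None else self.inferred.get(name)

# Example: the learner states no media preference, so the inferred value is used.
engine = PersonalisationEngine(LearnerProfile(instruction_method="worked-examples"))
engine.infer_from_usage([{"media": "video"}, {"media": "video"}, {"media": "text"}])
print(engine.setting("preferred_media"))     # -> "video"
print(engine.setting("instruction_method"))  # -> "worked-examples"
```

The ordering mirrors the levels described in the abstract: information the user explicitly provides always takes precedence over what the engine infers.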

Relevance: 80.00%

Publisher:

Abstract:

With the emergence of the "Semantic Web" there has been much discussion about the impact of technologies such as XML and RDF on the way we use the Web for developing e-learning applications and, perhaps more importantly, on how we can personalise these applications. Personalisation of e-learning is viewed by many authors (see, amongst others, Eklund & Brusilovsky, 1998; Kurzel, Slay, & Hagenus, 2003; Martinez, 2000; Sampson, Karagiannidis, & Kinshuk, 2002; Voigt & Swatman, 2003) as the key challenge for learning technologists. According to Kurzel (2004), the tailoring of e-learning applications can have an impact on the content and how it is accessed, the media forms used, the method of instruction employed, and the learning styles supported. This paper will report on a research project currently underway at the eCentre at the University of Greenwich which is exploring different approaches and methodologies to create an e-learning platform with personalisation built in. This personalisation is to be set at different levels within the system, ranging from being guided by the information that the user inputs into the system down to the lower level of being set using information inferred by the system's processing engine.
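
Since the abstract raises RDF alongside XML as an enabling technology, one illustrative way (not necessarily the project's) to represent a personalised learner profile is as RDF triples. The sketch below uses the Python rdflib library; the namespace http://example.org/elearning# and the property names are invented for the example.

```python
from rdflib import Graph, Literal, Namespace, RDF, URIRef

# Hypothetical vocabulary for the example; a real platform would publish its own.
EX = Namespace("http://example.org/elearning#")

g = Graph()
learner = URIRef("http://example.org/learners/42")

g.add((learner, RDF.type, EX.Learner))
g.add((learner, EX.prefersMedia, Literal("video")))            # explicit user input
g.add((learner, EX.inferredLearningStyle, Literal("visual")))  # set by the processing engine

# Serialise the profile as RDF/XML, i.e. the XML/RDF pairing the abstract mentions.
print(g.serialize(format="xml"))
```

Storing the profile as triples keeps the explicitly entered values and the inferred values in one graph while still letting the platform distinguish them by property.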

Relevance: 80.00%

Publisher:

Abstract:

In Britain since the 1960s television has been the most influential medium of popular culture. Television is also the site where the Western Front of popular culture clashes with the Western Front of history. This book examines the ways in which those involved in the production of historical documentaries for this most influential medium have struggled to communicate the stories of the First World War to British audiences. Documents in the BBC Written Archives Centre at Caversham, Berkshire, the Imperial War Museum, and the Liddell Hart Centre for Military Archives all inform the analysis. Interviews and correspondence with television producers, scriptwriters and production crew, as well as with two First World War veterans who appeared in several recent documentaries, provide new insights for the reader. Emma Hanna takes the reader behind the scenes of the making of the most influential documentaries, from the landmark epic series The Great War (BBC, 1964) up to more recent controversial productions such as The Trench (BBC, 2002) and Not Forgotten: The Men Who Wouldn't Fight (BBC, 2008). By examining the production, broadcast and reception of a number of British television documentaries, the book explores the difficult relationship between the war's history and its popular memory.

Relevance: 80.00%

Publisher:

Abstract:

The miniaturization and dissemination of audiovisual media into small, mobile assemblages of cameras, screens and microphones have brought "database cinema" (Manovich) into pockets and handbags. In turn, this micro-portability of video production calls for a reconsideration of database cinema, not as an aesthetic but rather as a media ecology that makes certain experiences and forms of interaction possible. In this context the clip and the fragment become a social currency (showing, trading online, etc.), and the enjoyment of a moment or "occasion" becomes an opportunity for recording, extending, preserving and displaying. If we are now the documentarists of our lives (as so many mobile phone adverts imply), it follows that we are also our own archivists. From the folksonomies of Flickr and YouTube to the slick "media centres" of Sony, Apple and Microsoft, the audiovisual home archive is a prized territory of struggle among platforms and brands. The database is emerging as the dominant (screen) medium of popular creativity and distribution, but it also brings the categories of "home" and "person" closer to that of the archive.

Relevance: 80.00%

Publisher:

Abstract:

This short position paper considers issues in developing a Data Architecture for the Internet of Things (IoT) through the medium of an exemplar project, Domain Expertise Capture in Authoring and Development Environments (DECADE). A brief discussion sets the background for IoT and the development of the distinction between things and computers. The paper makes a strong argument to avoid reinventing the wheel: approaches to distributed heterogeneous data architectures, and the lessons learned from that work, should be reused and applied to this situation. DECADE requires an autonomous recording system, local data storage, a semi-autonomous verification model, a sign-off mechanism, and qualitative and quantitative analysis carried out when and where required through a web-service architecture based on ontology and analytic agents, with a self-maintaining ontology model. To develop this, we describe a web-service architecture combining a distributed data warehouse, web services for analysis agents, ontology agents and a verification engine, with a centrally verified outcome database maintained by a certifying body for qualification/professional status.
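
As a rough, non-authoritative illustration of the data flow the abstract enumerates (autonomous recording, local storage, semi-autonomous verification and sign-off, then publication to a centrally verified outcome database), here is a minimal Python sketch. All class names and the toy verification rule are invented for the example and are not DECADE's actual design.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Record:
    """A single observation captured by the autonomous recording system."""
    learner_id: str
    activity: str
    verified: bool = False
    signed_off: bool = False

class LocalStore:
    """Local data storage holding records before verification."""
    def __init__(self) -> None:
        self.records: List[Record] = []

    def capture(self, record: Record) -> None:
        self.records.append(record)

class Verifier:
    """Semi-autonomous verification plus an explicit sign-off step."""
    def verify(self, record: Record) -> None:
        record.verified = bool(record.activity.strip())  # toy rule standing in for real checks

    def sign_off(self, record: Record) -> None:
        if record.verified:
            record.signed_off = True

class OutcomeDatabase:
    """Centrally verified outcome database, as maintained by a certifying body."""
    def __init__(self) -> None:
        self.outcomes: Dict[str, List[str]] = {}

    def publish(self, record: Record) -> None:
        if record.signed_off:
            self.outcomes.setdefault(record.learner_id, []).append(record.activity)

# Wiring the pipeline end to end: capture, verify, sign off, publish.
store, verifier, outcomes = LocalStore(), Verifier(), OutcomeDatabase()
store.capture(Record("learner-1", "configured a continuous-integration pipeline"))
for rec in store.records:
    verifier.verify(rec)
    verifier.sign_off(rec)
    outcomes.publish(rec)
print(outcomes.outcomes)
```

In the architecture the abstract describes, these stages would be separate web services over a distributed data warehouse, with ontology and analysis agents in between; the sketch only fixes the ordering of the stages.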