873 results for scholarly text editing
Abstract:
This introduction lays out the scholarly and methodological context in which to situate the contributions to this special issue. By combining a rigorous scrutiny of hitherto untapped archival sources with a re-examined application of Pierre Bourdieu’s sociology of culture within the field of periodical studies and publishing history in Italy (1940s-1950s), the studies illuminate the complex ways in which journals, periodical editors, and the connected publishing houses negotiated cultural practice in a literary field increasingly dominated by the polarization of political discourse.
Abstract:
In 1659-60, James Harrington and Henry Stubbe, two republican authors, engaged in a bad-tempered pamphlet debate about the constitution of classical Sparta. This took place in the context of political collapse after the fall of the Cromwellian Protectorate, as republicans desperately attempted to devise safeguards which could prevent the return of monarchy. Questions of constitutional form were not always at the forefront of 1650s English republicanism, but Harrington’s ideal constitution of ‘Oceana’ brought these questions to the fore in the discussions of 1659. Sparta formed a key plank of the ‘ancient prudence’ which supported Harrington’s theory; like Stubbe, he drew on Nicolaus Cragius’ De Republica Lacedaemoniorum (1593) for evidence and was attracted to some of the more apparently ‘aristocratic’ elements of the Spartan constitution. However, classical texts and modern scholarly authority, such as Cragius’, were not the only ingredients in the English version of the ‘classical republican’ tradition; sixteenth- and seventeenth-century political thinkers and current exigencies also shaped Harrington and Stubbe’s arguments. Both Harrington and Stubbe ended up challenging the scholarly and ancient consensus that Sparta was an aristocracy or mixed polity, Harrington reinterpreting it to assimilate it to ‘democracy’, and Stubbe attempting to rehabilitate a model of benign ‘oligarchy’.
Abstract:
After the war Italian artists and intellectuals saw a significant and necessary confluence between their political desire to create a "new" Italy and their cultural ambition to re-invigorate the study of medieval Italy. This tendency is particularly evident, I argue, in the post-war scholarly and critical focus on Boccaccio, and especially Boccaccio’s Decameron. Not only within the academy but also in the popular press, Boccaccio was granted pride of place in the canon, venerated as the pioneer of socially conscious vernacular literary realism, the archetype for the pursuit of artistic truth in the face of social upheaval. As a result, I wish to suggest, Italian neorealism, which rose to prominence in the first years after the Second World War, was in a significant sense imbued with and realised through a profound engagement with the work of Boccaccio. In turn, the cultural currents affiliated with neorealism influenced Boccaccio studies, whose operative notions of medieval «realism» were to a perhaps surprising degree stimulated by approaches to the neorealist poetics at work in the Italian films, novels, and criticism of the 1940s and ’50s. Situating the critical discourse surrounding Boccaccio within the post-war Italian context can therefore serve to shed unexpected light on both the cultural affirmation of neorealism and the disciplinary configuration of Italian medieval studies.
Abstract:
The reputation of The Phantom Carriage (Körkarlen) as one of the major films of Swedish silent cinema is in some respects securely established. Yet the film has attracted surprisingly little detailed discussion. It may be that its most striking stylistic features have deflected or discouraged closer scrutiny. Tom Gunning, for instance, in making the case for Sjöström’s Mästerman, argues that ‘Körkarlen wears its technique on its sleeve, overtly displays its unquestionable mastery of superimposition and complex narrative structure. Mästerman tucks its mastery of editing and composition up its sleeve, so to speak’. This article makes an argument for a different evaluation of The Phantom Carriage, bringing a critical and interpretative understanding of the film’s style into conversation with the historical accounts of film form which predominate in the scholarship around silent cinema. It suggests that the film achieves ‘mastery of editing and composition’ with a flexibility and fluidity in the construction of dramatic space that is in itself remarkable for its period, but that Sjöström’s achievements extend well beyond his handling of film space. Specifically, it discusses a segment which is in several respects at the heart of the film: the first meeting between the two central characters, David Holm (Victor Sjöström) and Sister Edit (Astrid Holm); it spans the film’s exact mid-point; and at almost twelve and a half minutes it is the longest uninterrupted passage to take place in a single setting. The article argues that the dramatic and structural centrality of the hostel segment is paralleled by its remarkably rich articulation of the relationships between action, character and space. We show how Sjöström’s creation of a three-dimensional filmic space - with no hint of frontality - becomes the basis for a reciprocal relationship between spatial naturalism and performance style, and for a mise-en-scène that can take on discrete interpretive force. The argument also places the hostel sequences within the film as a whole in order to show how relationships articulated through the detailed decisions in this section take on their full resonance within patterns and motifs that develop across the film.
Abstract:
This chapter re-evaluates the diachronic, evolutionist model that establishes the Second World War as a watershed between classical and modern cinemas, and ‘modernity’ as the political project of ‘slow cinema’. I will start by historicising the connection between cinematic speed and modernity, going on to survey the veritable obsession with the modern that continues to beset film studies despite the vagueness and contradictions inherent in the term. I will then attempt to clarify what is really at stake within the modern-classical debate by analysing two canonical examples of Japanese cinema, drawn from the geidomono genre (films on the lives of theatre actors), Kenji Mizoguchi’s The Story of the Last Chrysanthemums (Zangiku monogatari, 1939) and Yasujiro Ozu’s Floating Weeds (Ukigusa, 1959), with a view to investigating the role of the long take or, conversely, classical editing, in the production or otherwise of a supposed ‘slow modernity’. By resorting to Ozu and Mizoguchi, I hope to demonstrate that the best narrative films in the world have always combined a ‘classical’ quest for perfection with the ‘modern’ doubt of its existence, hence the futility of classifying cinema in general according to an evolutionary and Eurocentric model based on the classical-modern binary. Rather than on a confusing politics of the modern, I will draw on Bazin’s prophetic insight of ‘impure cinema’, a concept he forged in defence of literary and theatrical screen adaptations. Anticipating by more than half a century the media convergence on which the near totality of our audiovisual experience is currently based, ‘impure cinema’ will give me the opportunity to focus on the confluence of film and theatre in these Mizoguchi and Ozu films as the site of a productive crisis where established genres dissolve into self-reflexive stasis, ambiguity of expression and the revelation of the reality of the film medium, all of which, I argue, are more reliable indicators of a film’s political programme than historical teleology. At the end of the journey, some answers may emerge as to whether the combination of the long take and the long shot is sufficient to account for a film’s ‘slowness’ and whether ‘slow’ is indeed the best concept to signify resistance to the destructive pace of capitalism.
Abstract:
Philosophy has repeatedly denied cinema in order to grant it artistic status. Adorno, for example, defined an ‘uncinematic’ element in the negation of movement in modern cinema, ‘which constitutes its artistic character’. Similarly, Lyotard defended an ‘acinema’, which, rather than selecting and excluding movements through editing, accepts what is ‘fortuitous, dirty, confused, unclear, poorly framed, overexposed’. In his Handbook of Inaesthetics, Badiou embraces a similar idea, by describing cinema as an ‘impure circulation’ that incorporates the other arts. Resonating with Bazin and his defence of ‘impure cinema’, that is, of cinema’s interbreeding with other arts, Badiou seems to agree with him also in identifying the uncinematic as the location of the Real. This article will investigate the particular impurities of cinema that drive it beyond the specificities of the medium and into the realm of the other arts and the reality of life itself. Privileged examples will be drawn from various moments in film history and geography, starting with the analysis of two films by Jafar Panahi: This Is Not a Film (In film nist, 2011), whose anti-cinema stance is announced in its own title; and The Mirror (Aineh, 1997), another relentless exercise in self-negation. It goes on to examine Kenji Mizoguchi’s deconstruction of cinematic acting in his exploration of the geidomono genre (films about theatre actors) in The Story of the Last Chrysanthemums (Zangiku monogatari, 1939), and culminates in the conjuring of the physical experience of death through the systematic demolition of film genres in The Act of Killing (Joshua Oppenheimer et al., 2012).
Abstract:
The emergence and development of digital imaging technologies and their impact on mainstream filmmaking is perhaps the most familiar special effects narrative associated with the years 1981-1999. This is in part because some of the questions raised by the rise of the digital still concern us now, but also because key milestone films showcasing advancements in digital imaging technologies appear in this period, including Tron (1982) and its computer-generated image elements, the digital morphing in The Abyss (1989) and Terminator 2: Judgment Day (1991), computer animation in Jurassic Park (1993) and Toy Story (1995), digital extras in Titanic (1997), and ‘bullet time’ in The Matrix (1999). As a result, it is tempting to characterize 1981-1999 as a ‘transitional period’ in which digital imaging processes grow in prominence and technical sophistication, and what we might call ‘analogue’ special effects processes correspondingly become less common. But such a narrative risks eliding the other practices that also shape effects sequences in this period. Indeed, the 1980s and 1990s are striking for the diverse range of effects practices in evidence in both big budget films and lower budget productions, and for the extent to which analogue practices persist independently of or alongside digital effects work in a range of production and genre contexts. The chapter seeks to document and celebrate this diversity and plurality, this sustaining of earlier traditions of effects practice alongside newer processes, this experimentation with materials and technologies old and new in the service of aesthetic aspirations alongside budgetary and technical constraints. The common characterization of the period as a series of rapid transformations in production workflows, practices and technologies will be interrogated in relation to the persistence of certain key figures such as Douglas Trumbull, John Dykstra, and James Cameron, but also through a consideration of the contexts for and influences on creative decision-making. Comparative analyses of the processes used to articulate bodies, space and scale in effects sequences drawn from different generic sites of special effects work, including science fiction, fantasy, and horror, will provide a further frame for the chapter’s mapping of the commonalities and specificities, continuities and variations in effects practices across the period. In the process, the chapter seeks to reclaim analogue processes’ contribution both to moments of explicit spectacle, and to diegetic verisimilitude, in the decades most often associated with the digital’s ‘arrival’.
Abstract:
The sixteenth-century Shebet Yehudah is an account of the persecutions of Jews in various countries and epochs, including their expulsion from Spain in the fifteenth century. It is not a medieval text and was written long after many of the events it describes. Yet although it cannot give us a contemporary medieval standpoint, it provides important insights into how later Jewish writers perceived Jewish–papal relations in the thirteenth, fourteenth, and fifteenth centuries. Although the extent to which Jewish communities came into contact either with the papacy as an institution or with the actions of individual popes varied immensely, it is through analysis of Hebrew works such as the Shebet Yehudah that we are able to piece together a certain understanding of Jewish ideas about the medieval papacy as an institution and the policies of individual popes. This article argues that Jews knew only too well that papal protection was not unlimited, but always carefully circumscribed in accordance with Christian theology. It is hoped that it will be a scholarly contribution to our growing understanding of Jewish ideas about the papacy's spiritual and temporal power and authority in the Later Middle Ages and how this impacted on Jewish communities throughout medieval Europe.
Abstract:
The purported migrations that have formed the peoples of Britain have been the focus of generations of scholarly controversy. However, this debate has not benefited from direct analyses of ancient genomes. Here we report nine ancient genomes (~1×) of individuals from northern Britain: seven from a Roman-era York cemetery, bookended by earlier Iron-Age and later Anglo-Saxon burials. Six of the Roman genomes show affinity with modern British Celtic populations, particularly Welsh, but significantly diverge from populations from Yorkshire and other eastern English samples. They also show similarity with the earlier Iron-Age genome, suggesting population continuity, but differ from the later Anglo-Saxon genome. This pattern concords with a profound impact of migrations in the Anglo-Saxon period. Strikingly, one Roman skeleton shows a clear signal of exogenous origin, with affinities pointing towards the Middle East, confirming the cosmopolitan character of the Empire, even at its northernmost fringes.
Abstract:
Background: Massive open online courses (MOOCs) have become commonplace in the e-learning landscape. Thousands of elderly learners are participating in courses offered by various institutions on a multitude of platforms in many different languages. However, there is very little research into understanding elderly learners in MOOCs. Objective: We aim to show that a considerable proportion of elderly learners are participating in MOOCs and that there is a lack of research in this area. We hope this assertion of the wide gap in research on elderly learners in MOOCs will pave the way for more research in this area. Methods: Pre-course survey data for 10 University of Reading courses on the FutureLearn platform were analyzed to show the level of participation of elderly learners in MOOCs. Two MOOC aggregator sites (Class Central and MOOC List) were consulted to gather data on MOOC offerings that include topics relating to aging. In parallel, a selected set of MOOC platform catalogues, along with a recently published review on health and medicine-related MOOCs, were searched to find courses relating to aging. A systematic literature search was then employed to identify research articles on elderly learners in MOOCs. Results: The 10 courses reviewed had a considerable proportion of elderly learners participating in them. For the over-66 age group, this varied from 0.5% (on the course “Managing people”) to 16.3% (on the course “Our changing climate”), while for the over-56 age group it ranged from 3.0% (on “A beginners guide to writing in English”) to 39.5% (on “Heart health”). Only six MOOCs were found to include topics related to aging: three were on the Coursera platform, two on the FutureLearn platform, and one on the Open2Study platform. Just three scholarly articles relating to MOOCs and elderly learners were retrieved from the literature search. Conclusions: This review presents evidence to suggest that elderly learners are already participating in MOOCs. Despite this, there has been very little research into their engagement with MOOCs. Similarly, there has been little research into exploiting the scope of MOOCs for delivering topics that would be of interest to elderly learners. We believe there is potential to use MOOCs as a way of tackling the issue of loneliness among older adults by engaging them as either resource personnel or learners.
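The participation figures quoted in the results above are straightforward tabulations of the pre-course survey responses by self-reported age band. Purely as an illustration of that kind of calculation, the Python sketch below computes the share of older learners per course on invented toy data; the field names, age-band labels, and numbers are assumptions for illustration only, not the study's actual survey schema or figures.

# Hypothetical sketch: tabulate the share of older respondents per course
# from pre-course survey rows (one row per learner). All names and values
# here are assumed, not taken from the study.
from collections import Counter

responses = [
    {"course": "Heart health", "age_band": "56-65"},
    {"course": "Heart health", "age_band": "66+"},
    {"course": "Heart health", "age_band": "26-35"},
    {"course": "Managing people", "age_band": "36-45"},
    {"course": "Managing people", "age_band": "66+"},
]

OVER_56_BANDS = {"56-65", "66+"}   # assumed labels for the over-56 grouping
OVER_66_BANDS = {"66+"}            # assumed label for the over-66 grouping

totals = Counter(r["course"] for r in responses)

def share_per_course(bands):
    # count respondents in the requested age bands and express as a percentage
    hits = Counter(r["course"] for r in responses if r["age_band"] in bands)
    return {course: 100.0 * hits[course] / totals[course] for course in totals}

for course, pct in sorted(share_per_course(OVER_56_BANDS).items()):
    print(f"{course}: {pct:.1f}% of respondents aged 56 or over")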
Abstract:
In the eighteenth century the printing of Greek texts continued to be central to scholarship and discourse. The typography of Greek texts could be characterised as a continuation of French models from the sixteenth century, with a gradual dilution of the complexity of ligatures and abbreviations, mostly through printers in the Low Countries. In Britain, Greek printing was dominated by the university presses, which conservatively reproduced the continental models – exemplified by Oxford's Fell types, which were Dutch adaptations of earlier French models. Hindsight allows us to identify a meaningful development in the Greek types cut by Alexander Wilson for the Foulis Press in Glasgow, but we can argue that by the middle of the eighteenth century, when Baskerville was considering Greek printing, the typographic environment was ripe for a new style of Greek types. The opportunity to cut the types for a New Testament (in a twin edition that included a generous octavo and a large quarto version) would seem perfect for showcasing Baskerville's capacity for innovation. His Greek type maintained the cursive ductus of earlier models, but abandoned complex ligatures and any hint of scribal flourish. He homogenised the modulation of the letter strokes and the treatment of terminals, and normalised the horizontal alignments of all letters. Although the strokes are in some letters too delicate, the narrow set of the style composes a consistent, uniform texture that is a clean break from contemporaneous models. The argument is made that this is the first Greek typeface that can be described as fully typographic in the context of the technology of the time. It sets a pattern that was to be followed, without acknowledgement, by Richard Porson nearly half a century later. The typeface received little praise from typographic historians, and was condemned by Victor Scholderer in his retrospective of Greek typography. A survey of typeface reviews in the surrounding decades establishes that the commentators were mostly reproducing the views of an arbitrary typographic orthodoxy, for which only types with direct references to Renaissance models were acceptable. In these comments we detect a bias against someone considered an arriviste in the scholarly printing establishment, as well as a conservative attitude to typographic innovation.
Abstract:
Contemporary US sitcom is at an interesting crossroads: it has received an increasing amount of scholarly attention (e.g. Mills 2009; Butler 2010; Newman and Levine 2012; Vermeulen and Whitfield 2013), which largely understands it as shifting towards the aesthetically and narratively complex. At the same time, in the post-broadcasting era, US networks are particularly struggling for their audience share. With the days of blockbuster successes like Must See TV’s Friends (NBC 1994-2004) a distant dream, recent US sitcoms are instead turning towards smaller, engaged audiences. Here, a cult sensibility of intertextual in-jokes, temporal and narrational experimentation (e.g. flashbacks and alternate realities) and self-reflexive performance styles has marked shows including Community (NBC 2009-2015), How I Met Your Mother (CBS 2005-2014), New Girl (Fox 2011-present) and 30 Rock (NBC 2006-2013). However, not much critical attention has so far been paid to how these developments in textual sensibility in contemporary US sitcom may be influenced by, and in turn influence, the use of transmedia storytelling practices, an increasingly significant industrial concern and rising scholarly field of enquiry (e.g. Jenkins 2006; Mittell 2015; Richards 2010; Scott 2010; Jenkins, Ford and Green 2013). This chapter investigates this mutual influence between sitcom and transmedia by taking as its case studies two network shows that encourage invested viewership through their use of transtexts, namely How I Met Your Mother (hereafter HIMYM) and New Girl (hereafter NG). As such, it will pay particular attention to the most transtextually visible character/actor from each show: HIMYM’s Barney Stinson, played by Neil Patrick Harris, and NG’s Schmidt, played by Max Greenfield. This chapter argues that these sitcoms do not simply have their particular textual sensibility and also (happen to) engage with transmedia practices, but that the two are mutually informing and defining. This chapter explores the relationships and interplay between sitcom aesthetics, narratives and transmedia storytelling (or industrial transtexts), focusing on the use of multiple delivery channels in order to disperse “integral elements of a fiction” (Jenkins 2006: 95-6) by official entities such as the broadcasting channels. The chapter pays due attention to the specific production contexts of both shows and how these inform their approaches to transtexts. This chapter’s conceptual framework will be particularly concerned with how issues of texture, the reality envelope and accepted imaginative realism, as well as performance and the actor’s input, inform and illuminate contemporary sitcoms and transtexts, and will be the first scholarly research to do so. It will seek out points of connection between two (thus far) separate strands of scholarship and will move discussions on transtexts beyond the usual genres studied (i.e. science fiction and fantasy), as well as make a contribution to the growing scholarship on contemporary sitcom by approaching it from a new critical angle. On the basis that transmedia scholarship stands to benefit from widening its customary genre choice (i.e. telefantasy) for its case studies and from making more use of in-depth close analysis in its engagement with transtexts, the chapter argues that notions of texture, accepted imaginative realism and the reality envelope, as well as performance and the actor’s input, deserve more attention within transtext-related scholarship.
Abstract:
This chapter examines the importance of legitimacy for international organizations, and their efforts to legitimate themselves vis-à-vis different audiences. Legitimacy, which for decades barely featured in the scholarly analysis of international organizations, has since the late 1990s been an increasingly important lens through which the processes, practices, and structures of international organizations have been examined. The chapter makes three main arguments. First, it argues that in most international organizations the most important actors engaging in legitimation efforts are not the supranational bureaucracies, but member states. This has important implications for our understanding of the purposes of seeking legitimacy, and for the possible practices. Second, legitimacy and legitimation serve a range of purposes for these states, beyond achieving greater compliance with their decisions, which has been one of the key functional logics highlighted for legitimacy in the literature. Instead, legitimacy is frequently sought to exclude outsiders from the functional or territorial domains affected by an international organization’s authority, or to maintain external material and political support for existing arrangements. Third, one of the most prominent legitimation efforts, institutional reform, often prioritizes form over function, signalling to important and powerful audiences to encourage their continued material and political support. To advance these arguments, the chapter is divided into four sections. The first develops the concept of legitimacy and its application to international organizations, and then asks why their legitimacy has become such an important intellectual and political concern in recent years. The second part will look in more detail at the legitimation practices of international organizations, focusing on who engages in these practices, who the key audiences are, and how legitimation claims are advanced. The third section will look in more detail at one of the most common forms of legitimation – institutional reform – through the lens of two such reforms in international organizations: efforts towards greater interoperability in NATO, and the establishment of the African Peace and Security Architecture in the African Union (AU). The chapter will conclude with some reflections on the contribution that a legitimacy perspective has made to our understanding of the practices of international organizations.
Abstract:
This paper explores the relationship between discourse and action in practices involved in making and consuming texts. Texts are produced through the process of ‘entextualization’ in which strips of action and discourse are extracted from their original contexts and recontextualized into other situations. Different technologies for turning actions into texts affect the kinds of social actions and social identities that are made possible both at moments of entextualization and at future moments of recontextualization. In particular, I focus on how digital technologies affect the practices and participation structures around entextualization. Digital photography and video have had a profound effect on social practices and relationships around the making of texts. Specifically, they have made processes of entextualization more immediate, more contingent and more communal. Implications of these features of digital text making are discussed in light of previous work on literacy and orality.