Abstract:
This thesis is a comparative case study in Japanese video game localization, examining the video games Sairen, Sairen 2 and Sairen Nyûtoransurêshon and the English-language localized versions of the same games as published in Scandinavia and Australia/New Zealand. All games are developed by Sony Computer Entertainment Inc. and published exclusively for the PlayStation 2 and PlayStation 3 consoles. The fictional world of the Sairen games draws much influence from Japanese history, as well as from popular and contemporary culture, and in doing so caters mainly to a Japanese audience. For localization, i.e. the adaptation of a product to make it accessible to users outside its original target market, this poses a challenge. Video games are media of entertainment, and localization practice must therefore preserve the games’ effects on the players’ emotions. Further, video games are digital products composed of a multitude of distinct elements, some of which are part of the game world, while others regulate the connection between the player as part of the real world and the game as a digital medium. As a result, video game localization is also a practice that has to cope with the technical restrictions inherent to the medium. The main theory used throughout the thesis is Anthony Pym’s framework for localization studies, which considers the user of the localized product a defining part of the localization process. This concept presupposes that localization is an adaptation performed to make a product better suited for use in a specific reception situation. Pym also addresses the fact that certain products may resist distribution into certain reception situations because of their content, and that certain aspects of localization aim to reduce this resistance through significant alterations of the original product.
While Pym developed his ideas mainly with regular software in mind, they can also be adapted well to the study of video games from a localization angle. Since modern video games are highly complex entities that often switch between interactive and non-interactive modes, Pym’s ideas are adapted throughout the thesis to suit the particular elements being studied. Instances analyzed in this thesis include menu screens, video clips, in-game action and websites. The main research questions focus on how the games’ rules influence localization, and how the games’ fictional domain influences localization. Because there are so many peculiarities inherent to the medium of the video game, other theories are introduced as well to complement the research at hand. These include Lawrence Venuti’s discussions of foreignizing and domesticating translation methods for literary translation, and Jesper Juul’s definition of games. Additionally, knowledge gathered from interviews with video game localization professionals in Japan during September and October 2009 is also utilized for this study. Apart from answering the aforementioned research questions, one of this thesis’s aims is to enrich the still rather small field of game localization studies, and in particular the study of Japanese video games, one of Japan’s most successful cultural exports.
Abstract:
The thesis focuses on the social interaction and behavior of the homeless living in Tokyo's Taito Ward. The study is based on the author's own ethnographic field research carried out in the autumn of 2003. The methods chosen were variants of participant observation, applied according to context. The ethnographic field research was carried out from mid-August to the beginning of October 2003. The most important targets of the research were three separate, loosely knit groups located in different parts of Taito Ward. One of these groups was based in proximity to Ueno train station, one gathered every morning around a homeless support organization called San'yûkai, and one was based in Tamahime Park in the old San'ya area of Tokyo. The analysis is based on aspects of Takie Sugiyama Lebra's theory of "social relativism". Lebra's theory consists of the following, arguably universal aspects: belongingness, empathy, dependence, place in society, and reciprocity. In addition, all interaction and behavior are tied to the context and the situation. According to Lebra, ritual and intimate situations produce similar action, which is socially relative. Of these, the norms of ritual behavior are more regulated, while intimate behavior is more spontaneous. By contrast, an anomic situation produces anomic behavior, which is not socially relative. Lebra's theory is critically reviewed by the author of the thesis, who has attempted to modify it to make it more adaptable to present-day society and to the analysis. Erving Goffman's views of social interaction and Anthony Giddens' theories about social structures have been used as a complementary theoretical basis.
The aim of the thesis is to clarify how and why the interaction and behavior of some homeless individuals follow the aspects of Lebra's "social relativism" in some situations, and why in other situations they do not. In the latter cases the answers can be sought in regional and individual differences, or in the inaptness of the theory for analyzing the situation in question. Here, a significant factor is the major finding of the field study: the so-called "homeless etiquette", an abstract set of norms and values that influences the social interaction and behavior of the homeless, and with which many homeless individuals presented in the study complied. The fundamental goal of the thesis is to reach a profound understanding of the daily lives of the homeless individuals studied. The author argues that this kind of profound understanding is necessary when looking for sustainable solutions in the areas of social and housing policy to improve the position of the homeless and the qualitative functioning of society.
Abstract:
This doctoral thesis focuses on the translation of Finnish prose literature into English in the United Kingdom between 1945 and 2003. The subject is approached using translation archaeology, interviews, archival material, detailed text analysis and reception material. The main theoretical framework is Descriptive Translation Studies, and certain sociological theories (Bourdieu's field theory, actor-network theory) are also used. After charting the published translations, two periods of time are selected for closer analysis: an earlier period from 1955 to 1959, involving eight translations, and a later one from 1990 to 2003, with a total of six translations. While these translation numbers may appear low, they are actually rather high in proportion to the total number of 28 one-author literary prose translations published in the UK over the approximately 60 years being studied. The two periods of time, the 1950s and 1990s, are compared in terms of the sociological context of translation activity, the reception of translations and their textual features. The comparisons show that the main changes in translation practice between these two periods are increased completeness (translations in the 1950s group often being shortened by hundreds of pages) and lesser use of indirect translation via an intermediary language (about half of the 1950s translations having been translated via Swedish). Otherwise, translation practices have not changed much: except for large omissions, which are far more frequent in the 1950s, variation within each group is larger than between groups. As to the sociological context, the main changes are an increase in long-term institution-level contacts and an increase in the promotion of foreign translation rights by Finnish publishing houses. This is in contrast to the 1950s when translation rights were mainly sold through personal contacts by individual authors and translators.
The reception of translations is difficult to study because of scarce material. However, the 1950s translations were aggressively marketed and therefore obtained far more reviews and reprints than the 1990s translations. Several of the 1950s books, mostly historical novels by Mika Waltari, were mainstream bestsellers at the time, while current translations are frequently made for niche markets. The thesis introduces ample new material on the translation of Finnish prose literature into English in the UK. The results are also relevant to translation from a minority literature into a majority one. As to translation theory, they lead us to question the social nature of translation norms and the assumption of a static target culture. The translations analysed here are located in a very fragmented interculture and gain a stronger position in the Finnish culture than in the British one.
Abstract:
This study examines both theoretically and empirically how well the theories of Norman Holland, David Bleich, Wolfgang Iser and Stanley Fish can explain readers' interpretations of literary texts. The theoretical analysis concentrates on their views on language from the point of view of Wittgenstein's Philosophical Investigations. This analysis shows that many of the assumptions related to language in these theories are problematic. The empirical data show that readers often form very similar interpretations. Thus the study challenges the common assumption that literary interpretations tend to be idiosyncratic. The empirical data consist of freely worded written answers to questions on three short stories. The interpretations were made by 27 Finnish university students. Some of the questions addressed issues that were discussed in large parts of the texts, some referred to issues that were mentioned only in passing or implied. The short stories were "The Witch à la Mode" by D. H. Lawrence, "Rain in the Heart" by Peter Taylor and "The Hitchhiking Game" by Milan Kundera. According to Fish, readers create both the formal features of a text and their interpretation of it according to an interpretive strategy. People who agree form an interpretive community. However, a typical answer usually contains ideas repeated by several readers as well as observations not mentioned by anyone else. Therefore it is very difficult to determine which readers belong to the same interpretive community. Moreover, readers with opposing opinions often seem to pay attention to the same textual features and even acknowledge the possibility of an opposing interpretation; therefore they do not seem to create the formal features of the text in different ways. Iser suggests that an interpretation emerges from the interaction between the text and the reader when the reader determines the implications of the text and in this way fills the "gaps" in the text.
Iser believes that the text guides the reader, but as he also believes that meaning is on a level beyond words, he cannot explain how the text directs the reader. The similarity in the interpretations and the fact that the agreement is strongest when related to issues that are discussed broadly in the text do, however, support his assumption that readers are guided by the text. In Bleich's view, all interpretations have personal motives and each person has an idiosyncratic language system. The situation where a person learns a word determines the most important meaning it has for that person. In order to uncover the personal etymologies of words, Bleich asks his readers to associate freely on the basis of a text and note down all the personal memories and feelings that the reading experience evokes. Bleich's theory of the idiosyncratic language system seems to rely on a misconceived notion of the role that ostensive definitions have in language use. The readers' responses show that spontaneous associations to personal life seem to colour the readers' interpretations, but such instances are rather rare. According to Holland, an interpretation reflects the reader's identity theme. Language use is regulated by shared rules, but everyone follows the rules in his or her own way. Words mean different things to different people. The problem with this view is that if there is any basis for language use, it seems to be the shared way of following linguistic rules. Wittgenstein suggests that our understanding of words is related to the shared ways of using words and our understanding of human behaviour. This view seems to give better grounds for understanding similarity and differences in literary interpretations than the theories of Holland, Bleich, Fish and Iser.
Abstract:
The dissertation "From Conceptual to Corporeal, from Quotation to Site: Painting and History of Contemporary Art" explores the state of painting in contemporary art and art theory since the 1960s. The purpose of the study is to reconsider the dominant "end of painting" narrative in contemporary art history, which goes back to the modernist ideology of painting as a reductive, medium-specific form of art. Drawing on Michel Foucault's concepts of discursive formation and archive, as well as Jean-Luc Nancy's post-phenomenological philosophy on corporeality, I suggest that contemporary painting can be redefined as a discursive-sensuous practice. Instead of seeing painting as obsolete or over as an avantgarde art genre, I show that there have been alternative, neo-avantgardist ways of defining painting since the end of the 1960s, such as the French artist Daniel Buren's early writings on painting as "theoretical practice". Consequently, the tendency of the canonical Anglo-American contemporary art narratives to underestimate the historical and institutional codes of art can be questioned. This tendency can be seen, for example, in Rosalind Krauss's influential theory of the index. The study also reflects on the relations between conceptual art and painting since the 1960s and maps recent theories of painting, which re-examine the genre's possibilities after the modernist rhetoric. Concepts of "flatbed", "painting in the extended field", "as painting" and so on are compared critically with the idea of painting as discursive practice. It is also shown that the issues in painting arise from the contemporary critical art debate now that the dematerialisation paradigm of conceptual art has dissolved. The study focuses on the corporeal-material-sensuous cluster of meanings attached to painting and searches for its avantgardist possibilities as redefined by postfeminist and post-phenomenological discourse.
The ideas of the hierarchy of the senses and synesthesia are developed within the framework of Jean-Luc Nancy's and Luce Irigaray's thought. The parameters for the study have been Finnish painting from 1990 to 2002. On the Finnish art scene there has been no "end of painting" ideology, strictly speaking. The mythology and medium-specificity of modernism have been deconstructed since the mid-1980s, but "the archive" of painting, such as the themes of abstraction, formalism and synesthesia, has been re-worked by the discursive practice of painting, for example in the works of Nina Roos, Tarja Pitkänen-Walter and Jussi Niva.
Abstract:
My thesis concerns the notion of existence as an encounter, as developed in the philosophy of Gilles Deleuze (1925–1995). What this denotes is a critical stance towards a major current in the Western philosophical tradition which Deleuze nominates as representational thinking. Such thinking strives to provide a stable ground for identities by appealing to transcendent structures behind the apparent reality and explaining the manifest diversity of the given by such notions as essence, idea, God, or the totality of the world. In contrast to this, Deleuze states that abstractions such as these do not explain anything, but rather that they need to be explained. Yet, Deleuze does not appeal merely to the given. He sees that one must posit a genetic element that accounts for experience, and this element must not be naïvely traced from the empirical. Deleuze nominates his philosophy as transcendental empiricism and he seeks to bring together the approaches of both empiricism and transcendental philosophy. In chapter one I look into the motivations of Deleuze's transcendental empiricism and analyse it as an encounter between Deleuze's readings of David Hume and Immanuel Kant. This encounter regards, first of all, the question of subjectivity and results in a conception of identity as non-essential process. A pre-given concept of identity does not explain the nature of things, but the concept itself must be explained. From this point of view, the process of individualisation must become the central concern. In chapter two I discuss Deleuze's concept of the affect as the basis of identity and his affiliation with the theories of Gilbert Simondon and Jakob von Uexküll. From this basis develops a morphogenetic theory of individuation-as-process. In analysing such a process of individuation, the modal category of the virtual becomes of great value, being an open, indeterminate charge of potentiality.
As the virtual concerns becoming or the continuous process of actualisation, then time, rather than space, will be the privileged field of consideration. Chapter three is devoted to the discussion of the temporal aspect of the virtual and difference-without-identity. The essentially temporal process of subjectification results in a conception of the subject as composition: an assemblage of heterogeneous elements. Therefore art and aesthetic experience are valued by Deleuze because they disclose the construct-like nature of subjectivity in the sensations they produce. Through the domain of the aesthetic the subject is immersed in the network of affectivity that is the material diversity of the world. Chapter four addresses a phenomenon displaying this diversified identity: the simulacrum, an identity that is not grounded in an essence. Developed on the basis of the simulacrum, a theory of identity as assemblage emerges in chapter five. As the problematic of simulacra concerns perhaps foremost the artistic presentation, I shall look into the identity of a work of art as assemblage. To take an example of a concrete artistic practice and to remain within the problematic of the simulacrum, I shall finally address the question of reproduction, particularly in the case of recorded music and its identity in relation to the work of art. In conclusion, I propose that by overturning its initial representational schema, phonographic music addresses its own medium and turns it into an inscription of difference, exposing the listener to an encounter with the virtual.
Abstract:
The topic of my doctoral thesis is to demonstrate the usefulness of incorporating tonal and modal elements into a pitch-web square analysis of Béla Bartók's (1881-1945) opera, 'A kékszakállú herceg vára' ('Duke Bluebeard's Castle'). My specific goal is to demonstrate that different musical materials, which exist as foreground melodies or long-term key progressions, are unified by the unordered pitch set {0,1,4}, which becomes prominent in different sections of Bartók's opera. In Bluebeard's Castle, the set {0,1,4} is also found as a subset of several tetrachords: {0,1,4,7}, {0,1,4,8}, and {0,3,4,7}. My claim is that {0,1,4} serves to link music materials between themes, between sections, and also between scenes. This study develops an analytical method, drawn from various theoretical perspectives, for conceiving superposed diatonic spaces within a hybrid pitch-space comprised of diatonic and chromatic features. The integrity of diatonic melodic lines is retained, which allows for a non-reductive understanding of diatonic superposition, without appealing to pitch centers or specifying complete diatonic collections. Through combining various theoretical insights of the Hungarian scholar Ernő Lendvai, and the American theorists Elliott Antokoletz, Paul Wilson and Allen Forte, as well as the composer himself, this study gives a detailed analysis of the opera's pitch material in a way that combines, complements, and expands upon the studies of those scholars. The analyzed pitch sets are represented on Aarre Joutsenvirta's note-web square, which adds a new aspect to the field of Bartók analysis. Keywords: Bartók, Duke Bluebeard's Castle (Op. 11), Ernő Lendvai, axis system, Elliott Antokoletz, intervallic cycles, intervallic cells, Allen Forte, set theory, interval classes, interval vectors, Aarre Joutsenvirta, pitch-web square, pitch-web analysis.
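The subset relations described above can be illustrated with a small pitch-class computation. Note that {0,1,4} is a literal subset of {0,1,4,7} and {0,1,4,8}, but sits inside {0,3,4,7} only as a transposed form (T3 of {0,1,4} is {3,4,7}); the sketch below therefore checks containment under all transpositions (Tn) and inversions (TnI), the standard equivalence operations of pitch-class set theory. The function names are illustrative, not taken from the thesis.

```python
def tn_tni_forms(pcset):
    """All 24 Tn and TnI forms of a pitch-class set, mod 12."""
    inverted = {(-p) % 12 for p in pcset}
    forms = set()
    for n in range(12):
        forms.add(frozenset((p + n) % 12 for p in pcset))       # Tn
        forms.add(frozenset((p + n) % 12 for p in inverted))    # TnI
    return forms

def is_abstract_subset(subset, superset):
    """True if some Tn or TnI form of `subset` is contained in `superset`."""
    superset = set(superset)
    return any(form <= superset for form in tn_tni_forms(subset))

cell = {0, 1, 4}
for tetrachord in [{0, 1, 4, 7}, {0, 1, 4, 8}, {0, 3, 4, 7}]:
    # All three tetrachords named in the abstract contain a form of {0,1,4}
    print(sorted(tetrachord), is_abstract_subset(cell, tetrachord))
```

Sets are compared as unordered collections, so frozensets and the `<=` subset operator capture the "unordered pitch set" notion directly.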
Abstract:
In the eighteenth century, the birth of scientific societies in Europe created a new framework for scientific cooperation. Through a new contextualist study of the contacts between the first scientific societies in Sweden and the most important science academy in Europe at the time, l'Académie des Sciences in Paris, this dissertation aims to shed light on the role taken by the Swedish learned men in the new networks. It seeks to show that the academy model was related to a new idea of specialisation in science. In the course of the eighteenth century, it is argued, the study of the northern phenomena and regions offered the Swedes an important field of speciality with regard to their foreign colleagues. Although historical studies have often underlined the economic, practical undertone of eighteenth-century Swedish science, participation in fashionable scientific pursuits had also become an important scene for representation. However, the views prevailing in Europe tied civilisation and learning closely to the sunnier, southern climates, which had led to the difficulty of portraying Sweden as a learned country. The image of the scientific North, as well as the Swedish strategies to polish the image of the North as a place for science, are analysed as seen from France. While sixteenth-century historians had preferred to play down the effects of the cold and claim that northern conditions were similar to those elsewhere, the scientific exchange between Swedish and French researchers shows a new tendency to underline the difference of the North and its harsh climate. An explanation is sought by analysing how information about northern phenomena was used in France. In the European academies, new empirical methods had led to a need for direct observations on different phenomena and circumstances.
Rather than curiosities or objects for exoticism, the eighteenth-century depictions of the northern periphery tell about an emerging interest in the most extreme, and often most telling, examples of the workings of the invariable laws of nature. Whereas the idea of accumulating knowledge through cooperation was most manifest in joint astronomical projects, the idea of gathering and comparing data from differing places of observation appears also in other fields, from experimental philosophy to natural studies or medicine. The effects of these developments are studied and explained in connection to the Montesquieuan climate theories and the emerging pre-romantic ideas of man and society.
Abstract:
The Uppsala school of Axel Hägerström can be said to have been the last genuinely Swedish philosophical movement. On the other hand, the Swedish analytic tradition is often said to have its roots in Hägerström's thought. This work examines the transformation from Uppsala philosophy to analytic philosophy from an actor-based historical perspective. The aim is to describe how a group of younger scholars (Ingemar Hedenius, Konrad Marc-Wogau, Anders Wedberg, Alf Ross, Herbert Tingsten, Gunnar Myrdal) colonised the legacy of Hägerström and Uppsala philosophy, and faced the challenges they met in trying to reconcile this legacy with the changing philosophical and political currents of the 1930s and 40s. Following Quentin Skinner, the texts are analysed as moves or speech acts in a particular historical context. The thesis consists of five previously published case studies and an introduction. The first study describes how the image of Hägerström as the father of the Swedish analytic tradition was created by a particular faction of younger Uppsala philosophers who (re)presented the Hägerströmian philosophy as a parallel movement to logical empiricism. The second study examines the confrontations between Uppsala philosophy and logical empiricism both on the editorial board and in the pages of Sweden's leading philosophical journal, Theoria. The third study focuses on how the younger generation redescribed Hägerströmian legal philosophical ideas (Scandinavian Legal Realism), while the fourth study discusses how they responded to the accusations of a connection between Hägerström's value-nihilistic theory and totalitarianism. Finally, the fifth study examines how the Swedish social scientist and Social Democratic intellectual Gunnar Myrdal tried to reconcile value nihilism with a strong political programme for social reform. The contribution of this thesis to the field consists mainly in a re-evaluation of the role of Uppsala philosophy in the history of Swedish philosophy.
From this perspective the Uppsala School was less a collection of certain definite philosophical ideas than an intellectual legacy that was the subject of fierce struggles. Its theories and ideas were redescribed in various ways by individual actors with different philosophical and political intentions.
Abstract:
Music as the Art of Anxiety: A Philosophical Approach to the Existential-Ontological Meaning of Music. The present research studies music as an art of anxiety from the points of view of both Martin Heidegger's thought and phenomenological philosophy in general. In the Heideggerian perspective, anxiety is understood as a fundamental mode of being (Grundbefindlichkeit) in human existence. Taken as an existential-ontological concept, anxiety is conceived philosophically and not psychologically. The central research questions are: what is the relationship between music and existential-ontological anxiety? In what way can music be considered as an art of anxiety? In thinking of music as a channel and manifestation of anxiety, what makes it a special case? What are the possible applications of phenomenology and Heideggerian thought in musicology? The main aim of the research is to develop a theory of music as an art of existential-ontological anxiety and to apply this theory to musicologically relevant phenomena. Furthermore, the research will contribute to contemporary musicological debates and research as it aims to outline the phenomenological study of music as a field of its own; the development of a specific methodology is implicit in these aims. The main subject of the study, a theory of music as an art of anxiety, integrates Heideggerian and phenomenological philosophies with critical and cultural theories concerning violence, social sacrifice, and mimetic desire (René Girard), music, noise and society (Jacques Attali), and the affect-based charme of music (Vladimir Jankélévitch). Thus, in addition to the subjective mood (Stimmung) of emptiness and meaninglessness, the philosophical concept of anxiety also refers to a state of disorder and chaos in general; for instance, to noise in the realm of sound and total (social) violence at the level of society.
In this study, music is approached as conveying the existentially crucial human compulsion for signifying, i.e. organizing, chaos. In music, this happens primarily at the immediate level of experience, i.e. in affectivity, and also in relation to all of the aforementioned dimensions (sound, society, consciousness, and so on). Thus, music's existential-ontological meaning in human existence, Dasein, is in its ability to reveal different orders of existence as such. Indeed, this makes music the art of anxiety: more precisely, music can be existentially significant at the level of moods. The study proceeds from outlining the relevance of phenomenology and Heidegger's philosophy in musicology to the philosophical development of a theory of music as the art of anxiety. The theory is developed further through the study of three selected musical phenomena: the concept of a musical work, guitar smashing in the performance tradition of rock music, and Erik Bergman's orchestral work Colori ed improvvisazioni. The first example illustrates the level of the individual human subject in music as the art of anxiety, as a means of signifying chaos, while the second example focuses on the collective need to channel violence socio-culturally. The third example, being music-analytical, studies contemporary music's ability to mirror the structures of anxiety at the level of a specific musical text. The selected examples illustrate that, in addition to the philosophical orientation, the research also contributes to music analysis, popular music studies, and the cultural-critical study of music. Key words: music, anxiety, phenomenology, Martin Heidegger, ontology, guitar smashing, Erik Bergman, musical work, affectivity, Stimmung, René Girard
Abstract:
This study concentrates on the contested concept of pastiche in literary studies. It offers the first detailed examination of the history of the concept from its origins in the seventeenth century to the present, showing how pastiche emerged as a critical concept in interaction with the developing conception of authorial originality and the copyright laws protecting it. One of the key results of this investigation is the contextualisation of the postmodern debate on pastiche. Even though postmodern critics often emphasise the radical novelty of pastiche, they in fact resuscitate older positions and arguments without necessarily reflecting on their historical conditions. This historical background is then used to analyse the distinction between the primarily French conception of pastiche as the imitation of style and the postmodern notion of it as the compilation of different elements. The latter's vagueness and inclusiveness detract from its value as a critical concept. The study thus concentrates on the notion of stylistic pastiche, challenging the widespread prejudice that it is merely an indication of lack of talent. Because it rests on multiple layers of repetition, pastiche is in fact a highly ambiguous or double-edged practice that calls into question the distinction between repetition and original, thereby undermining the received notion of individual unique authorship as a fundamental aesthetic value. Pastiche does not, however, constitute a radical upheaval of the basic assumptions on which the present institution of literature relies, since, in order to mark its difference, pastiche always refers to a source outside itself against which its difference is measured. Finally, the theoretical analysis of pastiche is applied to literary works.
The pastiches written by Marcel Proust demonstrate how pastiche can become an integral part of a writer's poetics: imitation of style is shown to provide Proust with a way of exploring the role of style as a connecting point between inner vision and reality. The pastiches of the Sherlock Holmes stories by Michael Dibdin, Nicholas Meyer and the duo Adrian Conan Doyle and John Dickson Carr illustrate the functions of pastiche within a genre, detective fiction, that is itself fundamentally repetitive. A.S. Byatt's Possession and D.M. Thomas's Charlotte use Victorian pastiches to investigate the conditions of literary creation in the age of postmodern suspicion of creativity and individuality. The study thus argues that the concept of pastiche has valuable insights to offer to literary criticism and theory, and that literary pastiches, though often dismissed in reviews and criticism, are a particularly interesting object of study precisely because of their characteristic ambiguity.
Abstract:
The point of departure in this dissertation was the practical safety problem of unanticipated, unfamiliar events and unexpected changes in the environment: the demanding situations that operators must handle in complex socio-technical systems. The aim of this thesis was to increase the understanding of demanding situations and of the resources for coping with them, by presenting a new construct, a conceptual model called Expert Identity (ExId), as a way to open up new solutions to the problem of demanding situations, and by testing the model in empirical studies on operator work. The premises of the Core-Task Analysis (CTA) framework were adopted as a starting point: core-task-oriented working practices promote system efficiency (including safety, productivity and well-being targets) and should therefore be supported. The negative effects of stress were summarised and possible countermeasures related to the operators' personal resources, such as experience, expertise, sense of control, and conceptions of work and self, were considered. ExId was proposed as a way to bring emotional-energetic depth into work analysis, to supplement CTA-based practical methods for discovering development challenges, and to contribute to the development of complex socio-technical systems. The potential of ExId to promote understanding of operator work was demonstrated in the context of six empirical studies on operator work. Each of these studies had its own practical objectives within its correspondingly broad focus. The concluding research questions were: 1) Are the assumptions made in ExId on the basis of the different theories and previous studies supported by the empirical findings? 2) Does the ExId construct promote understanding of operator work in empirical studies? 3) What are the strengths and weaknesses of the ExId construct? The layers and assumptions of the development of expert identity appeared to be supported by the evidence.
The new conceptual model worked as part of the analysis of different kinds of data, as part of different methods used for different purposes, and in different work contexts. The results showed that the operators had problems taking care of the core task, resulting from a discrepancy between demands and resources (either personal or external). Changes in the work, difficulties in reaching the real content of work within the organisation, and the limits of the practical means of support had complicated the problem and constrained the development actions possible within the case organisations. Personal resources seemed to be sensitive to the changes; adaptation was taking place, but neither deeply nor quickly enough. Furthermore, the results revealed several characteristics of the studied contexts that limited the operators' possibilities to grow into or with the demands and to develop practices, expertise and an expert identity matching the core task: discontinuity of the work demands; discrepancies between the conceptions of work held in other parts of the organisation, the visions, and the reality faced by the operators; and an emphasis on individual efforts and situational solutions. The discussion considered the potential of ExId to open up new paths to solving the problem of demanding situations, and its ability to enable studies of practices in the field. The results were interpreted as promising enough to encourage further studies on ExId. This dissertation contributes especially to supporting workers in recognising changing demands and their own possibilities for growing with them, when aiming to support human performance in complex socio-technical systems, both in designing such systems and in solving existing problems.
Abstract:
Distraction in the workplace is increasingly common in the information age. Several tasks and sources of information compete for a worker's limited cognitive capacities in human-computer interaction (HCI). In some situations even very brief interruptions can have detrimental effects on memory, yet in other situations where persons are continuously interrupted, virtually no interruption costs emerge. This dissertation attempts to reveal the mental conditions and causalities differentiating the two outcomes. The explanation, building on the theory of long-term working memory (LTWM; Ericsson and Kintsch, 1995), focuses on the active, skillful aspects of human cognition that enable the storage of task information beyond the temporary and unstable storage provided by short-term working memory (STWM). Its key postulate is called a retrieval structure: an abstract, hierarchical knowledge representation built into long-term memory that can be utilized to encode, update, and retrieve products of cognitive processes carried out during skilled task performance. If certain criteria of practice and task processing are met, LTWM allows for the storage of large representations over long time periods, yet these representations can be accessed with the accuracy, reliability, and speed typical of STWM. The main thesis of the dissertation is that the ability to endure interruptions depends on the efficiency with which LTWM can be recruited for maintaining information. An observational study and a field experiment provide ecological evidence for this thesis. Mobile users were found to be able to carry out heavy interleaving and sequencing of tasks while interacting, and they exhibited several intricate time-sharing strategies to orchestrate interruptions in a way sensitive to both external and internal demands. Interruptions are inevitable, because they arise as natural consequences of the top-down and bottom-up control of multitasking.
In this process the function of LTWM is to keep some representations ready for reactivation and others in a more passive state to prevent interference. The psychological reality of the main thesis received confirmatory evidence in a series of laboratory experiments. They indicate that after encoding into LTWM, task representations are safeguarded from interruptions, regardless of their intensity, complexity, or pacing. However, when LTWM cannot be deployed, the problems posed by interference in long-term memory and by the limited capacity of STWM surface. A major contribution of the dissertation is the analysis of when users must resort to poorer maintenance strategies, such as temporal cues and STWM-based rehearsal. First, one experiment showed that different task orientations can be associated with radically different patterns of retrieval cue encodings. Thus the nature of the processing of the interface determines which features will be available as retrieval cues and which must be maintained by other means. Another study demonstrated that if the speed of encoding into LTWM, a skill-dependent parameter, is slower than the processing speed allowed for by the task, interruption costs emerge. Contrary to the predictions of competing theories, these costs turned out to involve intrusions in addition to omissions. Finally, it was learned that in rapid, visually oriented interaction, perceptual-procedural expectations guide task resumption, and neither STWM nor LTWM is utilized, because access to them is too slow. These findings imply a change in thinking about the design of interfaces. Several novel design principles are presented, based on the idea of supporting the deployment of LTWM in the main task.
Abstract:
In the 1990s, companies utilizing and producing new information technology, especially so-called new media, were also expected to be forerunners in new forms of work and organization. Researchers anticipated that new, more creative forms of work and a changing content of working life were about to replace old industrial and standardized ways of working. However, research on actual companies in the IT sector revealed a situation where only minor changes to existing organizational forms were seen. Many of the independent companies faced great difficulties trying to survive the rapid changes in products and production forms in the emerging field. Most research on the new media field has been conducted as surveys, and understanding of the actual everyday work process has remained thin. My research is a longitudinal study of the early phases of one new media company in Finland: an analysis of the challenges the company faced in a rapidly changing business field and of its attempts to overcome them. The two main analyses in the study focus on the developmental phases of the company and the disturbances in the production process. Based on these analyses, I study changes and learning at work using the methodological framework of developmental work research, a Finnish variant of cultural-historical activity theory applied to the study of learning and transformations at work. The data was gathered over a three-year period of ethnographic fieldwork. I documented the production processes and everyday life in the company as a participant observer. I interviewed key persons, video- and audio-taped meetings, followed e-mail correspondence, and collected various documents, such as agreements and memos. I developed a systematic method for analyzing the disturbances in the production process by combining the various data sources.
The systematic analysis of the disturbances depicted a very complex and only partly managed production process. The production process had a long duration, and no single actor had an understanding of it as a whole. Most of the disturbances had to do with customer relationships. The disturbances were latent in nature: they were recognized but not addressed. In the particular production processes that I analyzed, the ending life span of a particular product, a CD-ROM, became obvious. This finding can be interpreted in relation to the developmental phase of the production and the transformation of the field as a whole. Based on the analysis of the developmental phases and the disturbances, I formulate a hypothesis of the contradictions and developmental potentials of the activity studied. The conclusions of the study challenge the existing understanding of how to conceptualize and study organizational learning in production work. Most theories of organizational learning address neither qualitative changes in production nor the historical challenges of organizational learning itself. My study opens up a new horizon for understanding organizational learning in a rapidly changing field, where a learning culture based on craft or mass-production work is insufficient. There is a need for anticipatory and proactive organizational learning: proactive learning is needed to anticipate changes in the production type and in the life cycles of products.
Abstract:
This thesis examines brain networks involved in auditory attention and auditory working memory using measures of task performance, brain activity, and neuroanatomical connectivity. Auditory orienting and maintenance of attention were compared with visual orienting and maintenance of attention, and top-down controlled attention was compared with bottom-up triggered attention in audition. Moreover, the effects of cognitive load on performance and brain activity were studied using an auditory working memory task. Corbetta and Shulman's (2002) model of visual attention suggests that what is known as the dorsal attention system (intraparietal sulcus/superior parietal lobule, IPS/SPL, and frontal eye field, FEF) is involved in the control of top-down controlled attention, whereas what is known as the ventral attention system (temporo-parietal junction, TPJ, and areas of the inferior/middle frontal gyrus, IFG/MFG) is involved in bottom-up triggered attention. The present results show that top-down controlled auditory attention also activates IPS/SPL and FEF. Furthermore, in audition, TPJ and IFG/MFG were activated not only by bottom-up triggered attention but also by top-down controlled attention. In addition, the posterior cerebellum and thalamus were activated by top-down controlled attention shifts, and the ventromedial prefrontal cortex (VMPFC) was activated by to-be-ignored but attention-catching salient changes in auditory input streams. VMPFC may be involved in the evaluation of environmental events causing the bottom-up triggered engagement of attention. Auditory working memory activated a brain network that largely overlapped with the one activated by top-down controlled attention. The present results also provide further evidence of the role of the cerebellum in cognitive processing: during auditory working memory tasks, both activity in the posterior cerebellum (crus I/II) and reaction speed increased when the cognitive load increased.
Based on the present results and earlier theories on the role of the cerebellum in cognitive processing, the function of the posterior cerebellum in cognitive tasks may be related to the optimization of response speed.