Abstract:
Many governments worldwide are attempting to increase accountability, transparency, and the quality of services by adopting information and communications technologies (ICTs) to modernize and change the way their administrations work. Meanwhile, e-government is becoming a significant decision-making and service tool at local, regional and national government levels. The vast majority of users of these government online services see significant benefits from being able to access services online. The rapid pace of technological development has created increasingly powerful ICTs that are capable of radically transforming public institutions and private organizations alike. These technologies have proven to be extraordinarily useful instruments in enabling governments to enhance the quality, speed of delivery and reliability of services to citizens and to businesses (VanderMeer & VanWinden, 2003). However, just because the technology is available does not mean it is accessible to all. The term digital divide has been used since the 1990s to describe patterns of unequal access to ICTs—primarily computers and the Internet—based on income, ethnicity, geography, age, and other factors. Over time it has evolved to describe more broadly the disparities in technology usage that result from a lack of access, skills, or interest in using technology. This article provides an overview of recent literature on e-government and the digital divide, and includes a discussion of the potential of e-government to address the digital divide.
Abstract:
One of Cultural Studies' most important contributions to academic thinking about culture is its acceptance as axiomatic that we must not simply accept traditional value hierarchies in relation to cultural objects (see, for example, McGuigan, 1992: 157; Brunsdon, 1997: 5; Wark, 2001). Since Richard Hoggart and Raymond Williams took popular culture as a worthy object of study, Cultural Studies practitioners have accepted that the terms in which cultural debate had previously been conducted involved a category error. Opera is not 'better' than pop music, we believe in Cultural Studies; 'better for what?', we would ask. Similarly, Shakespeare is not 'better' than Mills and Boon, unless you can specify the purpose for which you want to use the texts. Shakespeare is indeed better than Mills and Boon for understanding seventeenth-century ideas about social organisation; but Mills and Boon is unquestionably better than Shakespeare if you want slightly scandalous, but ultimately reassuring, representations of sexual intercourse. The reason that we do not accept traditional hierarchies of cultural value is that we know that the culture commonly understood to be 'best' also happens to be that preferred by the most educated and most materially well-off people in any given culture (Bourdieu, 1984: 1-2; Ross, 1989: 211). We can interpret this information in at least two ways. On the one hand, it can be read as proving that the poorer and less well-educated members of a society do indeed have tastes which are innately less worthwhile than those of the material and educational elite. On the other hand, it can be interpreted as demonstrating that the cultural and material elite publicly represent their own tastes as the only correct ones. In Cultural Studies, we tend to favour the latter interpretation. We reject the idea that cultural objects have innate value - in terms of beauty, truth, excellence - simply 'there' in the object. That is, we reject 'aesthetic' approaches to culture (Bourdieu, 1984: 6, 485; Hartley, 1994: 6)1. In this, Cultural Studies is similar to other postmodern institutions, where high and popular culture can be mixed in ways unfamiliar to modernist culture (Sim, 1992: 1; Jameson, 1998: 100). So far, so familiar.
Abstract:
The Reporting and Reception of Indigenous Issues in the Australian Media was a three-year project financed by the Australian government through its Australian Research Council Large Grants Scheme and run by Professor John Hartley (of Murdoch and then Edith Cowan University, Western Australia). The purpose of the research was to map the ways in which indigeneity was constructed and circulated in Australia's mediasphere. The analysis of the 'reporting' element of the project was relatively straightforward: a mixture of content analysis of a large number of items in the media, and detailed textual analysis of a smaller number of key texts. The discoveries were interesting: when analysis approaches the media as a whole, rather than focusing exclusively on news or serious drama genres, the representation of indigeneity is not nearly as homogeneous as has previously been assumed. And if researchers do not explicitly set out to uncover racism in every text, it is by no means guaranteed that they will find it1. The question of how to approach the 'reception' of these issues - and particularly reception by indigenous Australians - proved to be a far more challenging one. In attempting to research this area, Hartley and I (working as a research assistant on the project) often found ourselves hampered by the axioms that underlie much media research. Traditionally, the 'reception' of media by indigenous people in Australia has been researched in ethnographic ways. This research repeatedly discovers that indigenous people in Australia are powerless in the face of new forms of media. Indigenous populations are represented as victims of aggressive and powerful intrusions: ‘What happens when a remote community is suddenly inundated by broadcast TV?’; ‘Overnight they will go from having no radio and television to being bombarded by three TV channels’; ‘The influence of film in an isolated, traditionally oriented Aboriginal community’2. This language of ‘influence’, ‘bombarded’ and ‘inundated’ presents metaphors not just of war but of a war being lost. It tells of an unequal struggle, of a more powerful force impinging upon a weaker one. What else could be the relationship of an Aboriginal audience to something which is ‘bombarding’ them? Or by which they are ‘inundated’? This attitude might best be summed up by the title of an article by Elihu Katz: ‘Can authentic cultures survive new media?’3. In such writing, there is little sense that what is being addressed might be seen as a series of discursive encounters, negotiations and acts of meaning-making in which indigenous people - communities and audiences - might be productive. Certainly, the points of concern in this type of writing are important. The question of what happens when a new communication medium is summarily introduced to a culture is an important one. But the language used to describe this interaction is misleading. And it is noticeable that such writing is fascinated only with the relationship of traditionally oriented Aboriginal communities to the media of mass communication.
Abstract:
The ‘anti-’ of ‘(Anti)Queer’ is a queer anti. In particle physics, a domain of science which was for a long time peddled as ultimately knowable, rational and objective, the postmodern turn has made everything queer (or chaotic, as the scientific version of this turn is perhaps more commonly named). This is a world where not only do two wrongs not make a right, but a negative and a positive do not calmly cancel each other out to leave nothing, as mathematics might suggest. When matter meets with anti-matter, the resulting explosion can produce not only energy - heat and light? - but new matter. We live in a world whose very basics are no longer the electron and the positron, but an ever-proliferating number of chaotic, unpredictable - queer? - subatomic particles. Some are ‘charmed’, others merely ‘strange’. Weird science indeed. The ‘Anti-’ of ‘(Anti)Queer’ does not place itself neatly into binaries. This is not a refutation of all that queer has been or will be. It is explicitly a confrontation, a challenge, an attempt to take seriously not only the claims made for queer but the potent contradictions and silences which stand proudly when any attempt is made to write a history of the term. Specifically, ‘(Anti)Queer’ is not Beyond Queer, the title of Bruce Bawer’s 1996 book which calmly and self-confidently explains the failings of queer, extols a return to a liberal political theory of cultural change and places its own marker on queer as a movement whose purpose has been served. We are not Beyond Queer. And if we are Anti-Queer, it is only to challenge those working in the arena to acknowledge and work with some of the facts of the movement’s history whose productivity has been erased with a gesture which has proved, bizarrely, to be reductive and homogenising.
Abstract:
New technologies have the potential both to expose children to and to protect them from television news footage likely to disturb or frighten. The advent of cheap, portable and widely available digital technology has vastly increased the possibility of violent news events being captured and potentially broadcast. This material has the potential to be particularly disturbing and harmful to young children. On the flip side, the same digital technology could be used to build in protection for young viewers, especially when it comes to preserving scheduled television programming and guarding against violent content being broadcast during live crosses from known trouble spots. Based on interviews with news directors and parents, and a review of published material, two recommendations are put forward: 1. Digital television technology should be employed to prevent news events "overtaking" scheduled children's programming and to protect the safe harbours placed in classification zones to protect children. 2. Broadcasters should regain control of the images that go to air during "live" feeds from obviously volatile situations by building in short delays in G classification zones.
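The "short delay" in the second recommendation can be pictured as a fixed-length buffer sitting between camera and transmitter. The sketch below is a minimal illustration of that idea only; it is not from the article, and the seven-second figure, frame rate and function names are hypothetical assumptions:

```python
from collections import deque

# Hypothetical fixed broadcast delay for a "live" feed: frames leave the
# buffer a set interval after they enter, giving an operator time to cut
# away before disturbing footage reaches air.
FRAME_RATE = 25                      # assumed frames per second (PAL)
DELAY_SECONDS = 7                    # assumed length of the safety delay
DELAY_FRAMES = FRAME_RATE * DELAY_SECONDS

def delayed_feed(live_frames):
    """Yield each frame of `live_frames` roughly DELAY_SECONDS late."""
    buffer = deque()
    for frame in live_frames:
        buffer.append(frame)
        if len(buffer) > DELAY_FRAMES:
            yield buffer.popleft()   # emitted ~7 seconds after capture
    while buffer:                    # drain the buffer when the feed ends
        yield buffer.popleft()
```

During the delay window an operator (or an automated check) can drop or replace buffered frames before they are transmitted, which is what makes such a buffer a protection mechanism rather than a mere lag.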
Abstract:
Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the advent of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering which occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only had an impact on traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the lecture notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, discrete-time Fourier, and discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. Design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II (which is suitable for an advanced signal processing course) considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
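To make two of the Part I topics concrete, the following minimal Python/NumPy sketch (our illustration, not code from the book) shows the Discrete Fourier Transform used for signal detection in noise, and convolution used as simple low-pass filtering:

```python
import numpy as np

# A noisy 50 Hz sinusoid, sampled at fs = 1 kHz: a stand-in for an
# analog signal that has been digitised for DSP.
fs = 1000                            # sampling frequency in Hz
t = np.arange(0, 1, 1 / fs)          # one second of sample times
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)

# Signal detection in noise via the Discrete Fourier Transform:
# the 50 Hz component stands out as a sharp spectral peak.
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
print(f"Dominant frequency: {freqs[np.argmax(np.abs(X))]:.1f} Hz")

# Convolution as filtering: a 10-point moving average acts as a
# crude low-pass filter that smooths the noise.
h = np.ones(10) / 10                 # impulse response of the filter
y = np.convolve(x, h, mode="same")   # filtered signal
```

Run as written, the script reports a dominant frequency of about 50 Hz even though the sinusoid is visually buried in noise, which is the essence of the spectral-detection application the abstract mentions.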
Abstract:
The emergence of mobile and ubiquitous computing has created what is referred to as a hybrid space – a virtual layer of digital information and interaction opportunities that sits on top of, and augments, the physical environment. The increasing connectedness through such media, from anywhere to anybody at any time, makes us less dependent on being physically present somewhere in particular. But what is the role of ubiquitous computing in making physical presence at a particular place more attractive? Acknowledging historic context and identity as important attributes of place, this work proceeds from a ‘global sense of place’ in which the cultural diversity, multiple identities, backgrounds, skills and experiences of people traversing a place are regarded as social assets of that place. The aim is to explore ways in which the physical architecture and infrastructure of a place can be mediated to make invisible social assets visible, thus augmenting people’s situated social experience. The focus is thereby on embodied media, i.e. media that materialise digital information as observable and sometimes interactive parts of the physical environment, and hence amplify people’s real-world experience rather than substituting for it or moving it to virtual spaces.
Abstract:
The Dark Ages are generally held to be a time of technological and intellectual stagnation in western development. But that is not necessarily the case. Indeed, from a certain perspective, nothing could be further from the truth. In this paper we draw historical comparisons, focusing especially on the thirteenth and fourteenth centuries, between the technological and intellectual ruptures in Europe during the Dark Ages and those of our current period. Our analysis is framed in part by Harold Innis’s2 notion of "knowledge monopolies". We give an overview of how these were affected by the new media, new power struggles, and new intellectual debates that emerged in thirteenth- and fourteenth-century Europe. The historical salience of our focus may seem elusive. Our world has changed so much, and history seems an increasingly far-from-favoured method for understanding our own period and its future potentials. Yet our seemingly distant historical focus provides some surprising insights into the social dynamics at work today: the fracturing of established knowledge and power bases; the democratisation of certain "sacred" forms of communication and knowledge and, conversely, the "sacrosanct" appropriation of certain vernacular forms; challenges and innovations in social and scientific method and thought; the emergence of media practices that shatter social worlds; struggles over control of vast networks of media and knowledge monopolies; and the enclosure of public discursive and social spaces for singular, manipulative purposes. The period between the eleventh and fourteenth centuries in Europe prefigured what we now call the Enlightenment, perhaps more so than any other period before or after; it shaped what the Enlightenment was to become. We claim no knowledge of the future here. But in the "post-everything" society, where history is as much up for sale as it is for argument, we argue that our historical perspective provides a useful analogy for grasping the wider trends in the political economy of media, and for recognising clear and actual threats to the future of the public sphere in supposedly democratic societies.
Abstract:
Language is a unique aspect of human communication because it can be used to discuss itself in its own terms. For this reason, human societies potentially have capacities for co-ordination, reflexive self-correction, and innovation superior to those of other animal, physical or cybernetic systems. However, this analysis also reveals that language is interconnected with the economically and technologically mediated social sphere, and hence is vulnerable to abstraction, objectification, reification, and therefore ideology – all of which are antithetical to its reflexive function, whilst paradoxically being a fundamental part of it. In particular, under capitalism, language is increasingly commodified within the social domains created and affected by ubiquitous communication technologies. The advent of the so-called ‘knowledge economy’ implicates exchangeable forms of thought (language) as the fundamental commodities of this emerging system. The historical point at which a ‘knowledge economy’ emerges, then, is the critical point at which thought itself becomes a commodified ‘thing’, and language becomes its "objective" means of exchange. However, the processes by which such commodification and objectification occur obscure the unique social relations within which these language commodities are produced. The latest economic phase of capitalism – the knowledge economy – and the obfuscating trajectory which accompanies it are, we argue, destroying the reflexive capacity of language, particularly through the process of commodification. This can be seen in the fact that the language practices that have emerged in conjunction with digital technologies are increasingly non-reflexive, and therefore less capable of self-critical, conscious change.