947 results for Digital communications.
Abstract:
This is a review of painter Andrzej Zielinski's exhibition at gallery 9 in Sydney. It highlights the artist's expressionistic style and strong colour sense, as well as his association with American painterly traditions. The artist's application of acrylic modelling paste to his paintings also gives them a sculptural and architectural dimension and, on a conceptual level, plays with notions of mimesis and material form.
Abstract:
The Queensland University of Technology (QUT) allows the presentation of a thesis for the Degree of Doctor of Philosophy in the format of published or submitted papers, where such papers have been published, accepted or submitted during the period of candidature. This thesis is composed of seven published/submitted papers, of which one has been published, three have been accepted for publication and the other three are under review. The project is financially supported by an Australian Research Council (ARC) Discovery Grant, with the aim of proposing strategies for the performance control of Distributed Generation (DG) systems based on digital estimation of power system signal parameters. Distributed Generation has recently been introduced as a new concept for the generation of power and the enhancement of conventionally produced electricity. The global warming issue calls for renewable energy resources in electricity production. Distributed generation based on solar energy (photovoltaic and solar thermal), wind, biomass and mini-hydro, along with fuel cells and microturbines, will gain substantial momentum in the near future. Technically, DG can be a viable solution to the problem of integrating renewable or non-conventional energy resources. Basically, DG sources can be connected to the local power system through power electronic devices, i.e. inverters or ac-ac converters. The interconnection of DG systems to the power system as a compensator or as a power source with high-quality performance is the main aim of this study. Source and load unbalance, load non-linearity, interharmonic distortion, supply voltage distortion, distortion at the point of common coupling in weak-source cases, source current power factor, and synchronism of generated currents or voltages are the issues of concern. The interconnection of DG sources is carried out using power electronic switching devices that inject high-frequency components in addition to the desired current.
Noise and harmonic distortion can also impair the performance of the control strategies. To mitigate the negative effects of high-frequency, harmonic and noise distortion and achieve satisfactory performance of DG systems, new methods of signal parameter estimation are proposed in this thesis. These methods are based on processing digital samples of power system signals. Thus, the targeted scope of this thesis is to propose advanced techniques for the digital estimation of signal parameters, together with methods for generating DG reference currents from the estimates provided. An introduction to this research – including a description of the research problem, the literature review and an account of the research progress linking the research papers – is presented in Chapter 1. One of the main parameters of a power system signal is its frequency. The Phasor Measurement (PM) technique is one of the renowned and advanced techniques used for the estimation of power system frequency. Chapter 2 presents an in-depth analysis of the PM technique to reveal its strengths and drawbacks. The analysis is followed by a new technique proposed to enhance the speed of the PM technique when the input signal is free of even-order harmonics. The other novel techniques proposed in this thesis are compared against the PM technique studied comprehensively in Chapter 2. An algorithm based on the concept of Kalman filtering is proposed in Chapter 3. The algorithm estimates signal parameters such as amplitude, frequency and phase angle online. The Kalman filter is modified to operate on the output signal of a Finite Impulse Response (FIR) filter designed as a plain summation. The frequency estimation unit is independent of the Kalman filter and uses the samples refined by the FIR filter. The estimated frequency is fed to the Kalman filter to build the transition matrices.
The initial settings for the modified Kalman filter are obtained through a trial-and-error exercise. Another algorithm, again based on the concept of Kalman filtering, is proposed in Chapter 4 for the estimation of signal parameters. The Kalman filter is likewise modified to operate on the output signal of the same FIR filter described above. However, the frequency estimation unit, unlike the one proposed in Chapter 3, is not segregated and interacts with the Kalman filter: the estimated frequency is fed to the Kalman filter, and the other parameters, such as the amplitudes and phase angles estimated by the Kalman filter, are fed back to the frequency estimation unit. Chapter 5 proposes another algorithm based on the concept of Kalman filtering. This time, the state parameters are obtained through matrix arrangements that reduce the noise level in the sample vector. The purified state vector is used to obtain a new measurement vector for a basic Kalman filter. The Kalman filter used has a structure similar to a basic Kalman filter, except that the initial settings are computed through extensive mathematical derivation based on the matrix arrangement utilised. Chapter 6 proposes another algorithm based on the concept of Kalman filtering, similar to that of Chapter 3. However, this time the initial settings required for the better performance of the modified Kalman filter are calculated instead of being guessed by trial and error. The simulation results for the estimated signal parameters are improved owing to the correct settings applied. Moreover, an enhanced Least Error Square (LES) technique is proposed to take over the estimation when a critical transient is detected in the input signal. In fact, some large, sudden changes in the parameters of the signal at these critical transients are not tracked well by Kalman filtering, whereas the proposed LES technique is found to be much faster in tracking these changes.
Therefore, an appropriate combination of the LES technique and modified Kalman filtering is proposed in Chapter 6. This time, the ability of the proposed algorithm is also verified on real data obtained from a prototype test object. Chapter 7 proposes a further algorithm based on the concept of Kalman filtering, similar to those of Chapters 3 and 6. However, this time an optimal digital filter is designed instead of the simple summation FIR filter. New initial settings for the modified Kalman filter are calculated based on the coefficients of the digital filter applied. Again, the ability of the proposed algorithm is verified on real data obtained from a prototype test object. Chapter 8 uses the estimation algorithm proposed in Chapter 7 in a scheme for interconnecting a DG unit to the power network. Robust estimates of the signal amplitudes and phase angles obtained by the estimation approach are used in the reference generation of the compensation scheme. Several simulation tests provided in this chapter show that the proposed scheme handles source and load unbalance, load non-linearity, interharmonic distortion, supply voltage distortion, and synchronism of generated currents or voltages very well. The proposed compensation scheme also prevents voltage distortion at the point of common coupling in weak-source cases, balances the source currents, and brings the supply-side power factor to a desired value.
Abstract:
This paper considers issues of methodological innovation in communication, media and cultural studies that arise from the extent to which we now live in a media environment characterised by an abundance of digital media, the convergence of media platforms, content and services, and the globalisation of media content through ubiquitous computing and high-speed broadband networks. These developments have also entailed a shift in the producer-consumer relationships that characterised the 20th-century mass communications paradigm, with the rapid proliferation of user-created content, accelerated innovation, the growing empowerment of media users themselves, and the blurring of distinctions between public and private, as well as of age-based distinctions in terms of what media can be accessed by whom and for what purpose. The paper considers these issues through a case study of the Australian Law Reform Commission's National Classification Scheme Review.
Abstract:
Defence organisations perform information security evaluations to confirm that electronic communications devices are safe to use in security-critical situations. Such evaluations include tracing all possible dataflow paths through the device, but this process is tedious and error-prone, so automated reachability analysis tools are needed to make security evaluations faster and more accurate. Previous research has produced a tool, SIFA, for dataflow analysis of basic digital circuitry, but it cannot analyse dataflow through microprocessors embedded within the circuit, since this depends on the software they run. We have developed a static analysis tool that produces SIFA-compatible dataflow graphs from embedded microcontroller programs written in C. In this paper we present a case study which shows how this new capability supports combined hardware and software dataflow analyses of a security-critical communications device.
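The idea of deriving a dataflow graph from C source can be illustrated, at its very simplest, by collecting (source → sink) edges from assignment statements. This toy sketch is only illustrative of the concept; the actual tool described above must handle control flow, pointers, function calls and real C parsing, none of which a regular expression can do.

```python
import re

def dataflow_edges(c_source):
    """Toy sketch: derive directed (source, sink) dataflow edges from
    simple C assignment statements of the form 'x = expr;'.
    Every identifier appearing in expr flows into x."""
    ident = r'[A-Za-z_]\w*'
    edges = set()
    for lhs, rhs in re.findall(rf'({ident})\s*=\s*([^;]+);', c_source):
        for var in re.findall(ident, rhs):
            edges.add((var, lhs))
    return edges
```

For example, `dataflow_edges("c = a + b; d = c;")` yields edges from `a` and `b` into `c`, and from `c` into `d`, which is the kind of graph a reachability analysis can then traverse.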
Abstract:
In this article I would like to examine the promise and possibilities of music, digital media and the National Broadband Network. I will do this based on concepts that have emerged from a study undertaken by Professor Andrew Brown and me, which categorises technologies into what we term representational technologies and technologies with agency.
Abstract:
Advances in information and communication technologies have brought about an information revolution, leading to fundamental changes in the way information is collected or generated, shared and distributed. The internet and digital technologies are re-shaping research, innovation and creativity. Economic research has highlighted the importance of information flows and the availability of information for access and re-use. Information is crucial to the efficiency of markets and enhanced information flows promote creativity, innovation and productivity. There is a rapidly expanding body of literature which supports the economic and social benefits of enabling access to and re-use of public sector information.1 (Note that a substantial research project associated with QUT’s Intellectual Property: Knowledge, Culture and Economy (IPKCE) Research Program is engaged in a comprehensive study and analysis of the literature on the economics of access to public sector information.)
Abstract:
While the 2007 Australian federal election was notable for the use of social media by the Australian Labor Party in campaigning, the 2010 election took place in a media landscape in which social media – especially Twitter – had become much more embedded in both political journalism and independent political commentary. This article draws on the computer-aided analysis of election-related Twitter messages, collected under the #ausvotes hashtag, to describe the key patterns of activity and thematic foci of the election’s coverage in this particular social media site. It introduces novel metrics for analysing public communication via Twitter, and describes the related methods. What emerges from this analysis is the role of the #ausvotes hashtag as a means of gathering an ad hoc ‘issue public’ – a finding which is likely to be replicated for other hashtag communities.
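One family of metrics used on hashtag datasets of this kind measures how concentrated activity is among the most active contributors. The sketch below is a generic, hypothetical example of such a metric (the function name and the "top 10% of users" cut-off are illustrative, not the article's actual metrics):

```python
from collections import Counter

def activity_metrics(senders):
    """Given the sender of each tweet in a hashtag dataset, report
    total tweets, distinct users, and the share of all tweets
    contributed by the most active 10% of users (at least one user)."""
    counts = Counter(senders)
    total = sum(counts.values())
    top = counts.most_common(max(1, len(counts) // 10))
    share = sum(n for _, n in top) / total
    return {"tweets": total, "users": len(counts),
            "top10pct_share": round(share, 3)}
```

A high `top10pct_share` indicates a discussion dominated by a small core of highly active participants, while a low value suggests a broader, flatter ‘issue public’.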
Abstract:
Twitter is now well established as the world’s second most important social media platform, after Facebook. Its 140-character updates are designed for brief messaging, and its network structures are kept relatively flat and simple: messages from users are either public and visible to all (even to unregistered visitors using the Twitter website), or private and visible only to approved ‘followers’ of the sender; there are no more complex definitions of degrees of connection (family, friends, friends of friends) such as are available in other social networks. Over time, Twitter users have developed simple but effective mechanisms for working around these limitations: ‘#hashtags’, which enable the manual or automatic collation of all tweets containing the same #hashtag, as well as allowing users to subscribe to content feeds that contain only those tweets which feature specific #hashtags; and ‘@replies’, which allow senders to direct public messages even to users whom they do not already follow. This paper documents a methodology for extracting public Twitter activity data around specific #hashtags, and for processing these data in order to analyse and visualise the @reply networks existing between participating users – both overall, as a static network, and over time, to highlight the dynamic structure of @reply conversations. Such visualisations enable us to highlight the shifting roles played by individual participants, as well as the response of the overall #hashtag community to new stimuli – such as the entry of new participants or the availability of new information. Over longer timeframes, it is also possible to identify different phases in the overall discussion, or the formation of distinct clusters of preferentially interacting participants.
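The @reply extraction step described above can be sketched as follows. This is a minimal, assumed representation – (sender, text) pairs rather than any specific Twitter API format – showing how directed, weighted @reply edges are counted for later network visualisation:

```python
import re
from collections import Counter

def reply_network(tweets):
    """Given (sender, text) pairs collected for a #hashtag, count
    directed sender -> @target edges.  Edge weights record how often
    each sender addressed each target."""
    edges = Counter()
    for sender, text in tweets:
        for target in re.findall(r'@(\w+)', text):
            # Twitter usernames are case-insensitive, so normalise.
            edges[(sender.lower(), target.lower())] += 1
    return edges
```

The resulting edge list can be loaded directly into standard network analysis tools to produce the static and time-sliced @reply network visualisations the paper discusses.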
Abstract:
Learning a digital tool is often a hidden process. We tend to learn new tools in a bewildering range of ways. Formal, informal, structured, random, conscious, unconscious, individual and group strategies may all play a part, but are often lost to us in the complex and demanding processes of learning. But when we reflect carefully on the experience, some patterns and surprising techniques emerge. This monograph presents the thinking of four students in MDN642, Digital Pedagogies, who deliberately reflected on the mental processes at work as they learnt a digital technology of their choice.
Abstract:
Our research explores the design of networked technologies to facilitate local suburban communications and to encourage people to engage with their local community. While there are many investigations of interaction designs for networked technologies, most research utilises small exercises, workshops or other short-term studies to investigate interaction designs. However, we have found these short-term methods to be ineffective in the context of understanding local community interaction. Moreover, we find that people are resistant to putting their time into workshops and exercises – understandably so, because these are academic practices, not local community practices. Our contribution is to detail a long-term embedded design approach in which we interact with the community, over the long term and in the course of normal community goings-on, with an evolving exploratory prototype. This paper discusses the embedded approach to working in the wild for extended field research.
Abstract:
This paper reports on an exploratory study of the role of web and social media in e-government, especially in the context of Malaysia, with some comparisons and contrasts from other countries where such governmental efforts have been underway for a while. It describes the current e-government efforts in Malaysia, and proposes that applying a theoretical framework would help understand the context and streamline these ongoing efforts. Specifically, it lays out a theoretical and cultural framework based on Mary Douglas’ (1996) Grid-Group Theory, Mircea Georgescu’s (2005) Three Pillars of E-Government, and Gerald Grant’s and Derek Chau’s (2006) Generic Framework for E-Government. Although this study is in its early stages, it has relevance to everyone who is interested in e-government efforts across the world, and is especially relevant to developing countries.
Abstract:
Innovation processes are rarely smooth, and disruptions often occur at transition points where one knowledge domain passes the technology on to another domain. At these transition points, communication is a key component in assisting the smooth handover of technologies. However, for smooth transitions to occur, we argue that appropriate structures have to be in place and boundary-spanning activities need to be facilitated. This paper presents three case studies of innovation processes, and the findings support the view that structures and boundary spanning are essential for smooth transitions. We have explained the need to pass primary responsibility between agents to successfully bring an innovation to market. We have also shown the need to combine knowledge through effective communication so that absorptive capacity is built into processes throughout the organisation, rather than residing in one or two key individuals.
Abstract:
These digital stories were produced during a commercial research project with SLQ and Flying Arts. The works build on the research of Klaebe and Burgess on variable workshop scenarios and the institutional contexts of co-creative media. In this instance, research focused on the distributed digital storytelling workshop model and the development of audiences for digital storytelling. The research team worked with regional artists whose work had been selected for inclusion in the Five Senses exhibition held at the State Library of Queensland to produce stories about their work; these works were then in turn integrated into the physical exhibition space. Remoteness of location and timelines were factors in how the stories were made, through a mix of individual meetings and remote correspondence (email and phone).
Abstract:
Historically, determining the country of origin of a published work presented few challenges, because works were generally published physically – whether in print or otherwise – in a distinct location or few locations. However, publishing opportunities presented by new technologies mean that we now live in a world of simultaneous publication – works that are first published online are published simultaneously to every country in the world in which there is Internet connectivity. While this is certainly advantageous for the dissemination and impact of information and creative works, it creates potential complications under the Berne Convention for the Protection of Literary and Artistic Works (“Berne Convention”), an international intellectual property agreement to which most countries in the world now subscribe. Under the Berne Convention’s national treatment provisions, rights accorded to foreign copyright works may not be subject to any formality, such as registration requirements (although member countries are free to impose formalities in relation to domestic copyright works). In Kernel Records Oy v. Timothy Mosley p/k/a Timbaland, et al., however, the Florida Southern District Court of the United States ruled that first publication of a work on the Internet via an Australian website constituted “simultaneous publication all over the world,” and therefore rendered the work a “United States work” under the definition in section 101 of the U.S. Copyright Act, subjecting the work to the registration formality under section 411. This ruling is in sharp contrast with an earlier decision delivered by the Delaware District Court in Håkan Moberg v. 33T LLC, et al., which arrived at the opposite conclusion. The conflicting rulings of the U.S. courts reveal the problems posed by new forms of publishing online and demonstrate a compelling need for further harmonisation between the Berne Convention, domestic laws and the practical realities of digital publishing.
In this article, we argue that even if a work first published online can be considered to be simultaneously published all over the world it does not follow that any country can assert itself as the “country of origin” of the work for the purpose of imposing domestic copyright formalities. More specifically, we argue that the meaning of “United States work” under the U.S. Copyright Act should be interpreted in line with the presumption against extraterritorial application of domestic law to limit its application to only those works with a real and substantial connection to the United States. There are gaps in the Berne Convention’s articulation of “country of origin” which provide scope for judicial interpretation, at a national level, of the most pragmatic way forward in reconciling the goals of the Berne Convention with the practical requirements of domestic law. We believe that the uncertainties arising under the Berne Convention created by new forms of online publishing can be resolved at a national level by the sensible application of principles of statutory interpretation by the courts. While at the international level we may need a clearer consensus on what amounts to “simultaneous publication” in the digital age, state practice may mean that we do not yet need to explore textual changes to the Berne Convention.
Abstract:
Google, Facebook, Twitter and LinkedIn are among the prominent large-scale digital service providers that are having a tremendous impact on societies, corporations and individuals. However, despite their rapid uptake and their obvious influence on the behaviour of individuals and the business models and networks of organisations, we still lack a deeper, theory-guided understanding of the related phenomena. We use Teece’s notion of complementary assets and extend it towards ‘digital complementary assets’ (DCA) in an attempt to provide such a theory-guided understanding of these digital services. Building on Teece’s theory, we make three contributions. First, we offer a new conceptualisation of digital complementary assets in the form of digital public goods and digital public assets. Second, we differentiate three models for how organisations can engage with such digital complementary assets. Third, the user base is found to be a critical factor when considering appropriability.