952 results for Digital Reference Service
Abstract:
Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the advent of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superior performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering that occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only had an impact on traditional areas of electrical engineering, but has also had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the Lecture Notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, Discrete-time Fourier, and Discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. The design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II, which is suitable for an advanced signal processing course, considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
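As a purely illustrative aside (not drawn from the book), the following minimal NumPy sketch touches two of the Part I core topics listed above, convolution and the Discrete Fourier Transform; the sampling rate, tone frequency and filter length are arbitrary example values.

```python
# Illustrative sketch only: convolution (FIR smoothing) and the Discrete
# Fourier Transform applied to a noisy tone. All parameter values are
# arbitrary examples, not taken from the book.
import numpy as np

fs = 1000                                  # sampling frequency in Hz (example)
t = np.arange(0, 1, 1 / fs)                # one second of samples
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)   # noisy 50 Hz tone

# Convolution with a 21-tap moving-average FIR filter (simple low-pass smoothing).
h = np.ones(21) / 21
y = np.convolve(x, h, mode="same")

# Discrete Fourier Transform of the filtered signal: the spectral peak sits
# at the 50 Hz bin, illustrating spectral analysis of a signal buried in noise.
Y = np.fft.rfft(y)
freqs = np.fft.rfftfreq(y.size, d=1 / fs)
print("Dominant frequency: %.1f Hz" % freqs[np.argmax(np.abs(Y))])
```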
Abstract:
Objectives: Comparatively few people with severe mental illness are employed despite evidence that many people within this group wish to obtain, can obtain and sustain employment, and that employment can contribute to recovery. This investigation aimed to: (i) describe the current policy and service environment within which people with severe mental illness receive employment services; (ii) identify evidence-based practices that improve employment outcomes for people with severe mental illness; (iii) determine the extent to which the current Australian policy environment is consistent with the implementation of evidence-based employment services for people with severe mental illness; and (iv) identify methods and priorities for enhancing employment services for Australians with severe mental illness through implementation of evidence-based practices. Method: Current Australian practices were identified with reference to policy and legal documents, funding body requirements and anecdotal reports. Evidence-based employment services for people with severe mental illness were identified through examination of published reviews and the results of recent controlled trials. Results: Current policy settings support the provision of employment services for people with severe mental illness separate from clinical services. Recent studies have identified integration of clinical and employment services as a major factor in the effectiveness of employment services. This is usually achieved through co-location of employment and mental health services. Conclusions: Optimal evidence-based employment services are needed by Australians with severe mental illness. Providing optimal services is a challenge in the current policy environment. Service integration may be achieved through enhanced intersectoral links between employment and mental health service providers as well as by co-locating employment specialists within a mental health care setting.
Abstract:
The Queensland University of Technology (QUT) allows the presentation of a thesis for the Degree of Doctor of Philosophy in the format of published or submitted papers, where such papers have been published, accepted or submitted during the period of candidature. This thesis is composed of seven published/submitted papers, of which one has been published, three have been accepted for publication and the other three are under review. This project is financially supported by an Australian Research Council (ARC) Discovery Grant with the aim of proposing strategies for the performance control of Distributed Generation (DG) systems with digital estimation of power system signal parameters. Distributed Generation (DG) has recently been introduced as a new concept for the generation of power and the enhancement of conventionally produced electricity. The global warming issue calls for renewable energy resources in electricity production. Distributed generation based on solar energy (photovoltaic and solar thermal), wind, biomass and mini-hydro, along with the use of fuel cells and micro turbines, will gain substantial momentum in the near future. Technically, DG can be a viable solution to the issue of integrating renewable or non-conventional energy resources. Basically, DG sources can be connected to the local power system through power electronic devices, i.e. inverters or ac-ac converters. The interconnection of DG systems to the power system as a compensator or a power source with high-quality performance is the main aim of this study. Source and load unbalance, load non-linearity, interharmonic distortion, supply voltage distortion, distortion at the point of common coupling in weak source cases, source current power factor, and synchronism of generated currents or voltages are the issues of concern. The interconnection of DG sources is carried out using power electronic switching devices, which inject high-frequency components along with the desired current. Also, noise and harmonic distortion can impact the performance of the control strategies. To mitigate the negative effects of high-frequency components, harmonics and noise, and to achieve satisfactory performance of DG systems, new methods of signal parameter estimation are proposed in this thesis. These methods are based on processing the digital samples of power system signals. The targeted scope of this thesis is therefore to propose advanced techniques for the digital estimation of signal parameters, together with methods for generating DG reference currents from the estimates provided. An introduction to this research – including a description of the research problem, the literature review and an account of the research progress linking the research papers – is presented in Chapter 1. One of the main parameters of a power system signal is its frequency. The Phasor Measurement (PM) technique is one of the well-known advanced techniques used for the estimation of power system frequency. Chapter 2 presents an in-depth analysis of the PM technique to reveal its strengths and drawbacks. The analysis is followed by a new technique proposed to enhance the speed of the PM technique when the input signal is free of even-order harmonics. The other novel techniques proposed in this thesis are compared with the PM technique studied comprehensively in Chapter 2. An algorithm based on the concept of Kalman filtering is proposed in Chapter 3.
The algorithm is intended to estimate signal parameters such as amplitude, frequency and phase angle online. The Kalman filter is modified to operate on the output signal of a Finite Impulse Response (FIR) filter designed by a plain summation. The frequency estimation unit is independent of the Kalman filter and uses the samples refined by the FIR filter. The estimated frequency is given to the Kalman filter to be used in building the transition matrices. The initial settings for the modified Kalman filter are obtained through a trial-and-error exercise. Another algorithm, again based on the concept of Kalman filtering, is proposed in Chapter 4 for the estimation of signal parameters. The Kalman filter is also modified to operate on the output signal of the same FIR filter described above. However, unlike the one proposed in Chapter 3, the frequency estimation unit is not segregated and interacts with the Kalman filter. The estimated frequency is given to the Kalman filter, and other parameters such as the amplitudes and phase angles estimated by the Kalman filter are fed back to the frequency estimation unit. Chapter 5 proposes another algorithm based on the concept of Kalman filtering. This time, the state parameters are obtained through matrix arrangements in which the noise level in the sample vector is reduced. The purified state vector is used to obtain a new measurement vector for a basic Kalman filter. The Kalman filter used has a structure similar to a basic Kalman filter, except that the initial settings are computed through extensive mathematical analysis of the matrix arrangement utilized. Chapter 6 proposes another algorithm based on the concept of Kalman filtering, similar to that of Chapter 3. However, this time the initial settings required for the better performance of the modified Kalman filter are calculated instead of being guessed by trial and error. The simulation results for the estimated signal parameters are improved due to the correct settings applied. Moreover, an enhanced Least Error Square (LES) technique is proposed to take over the estimation when a critical transient is detected in the input signal. In fact, some large, sudden changes in the parameters of the signal at these critical transients are not well tracked by Kalman filtering, whereas the proposed LES technique is found to be much faster in tracking these changes. Therefore, an appropriate combination of the LES technique and modified Kalman filtering is proposed in Chapter 6. Also, this time the ability of the proposed algorithm is verified on real data obtained from a prototype test object. Chapter 7 proposes another algorithm based on the concept of Kalman filtering, similar to those of Chapters 3 and 6. However, this time an optimal digital filter is designed instead of the simple summation FIR filter. New initial settings for the modified Kalman filter are calculated based on the coefficients of the digital filter applied. Also, the ability of the proposed algorithm is verified on real data obtained from a prototype test object. Chapter 8 uses the estimation algorithm proposed in Chapter 7 for the interconnection scheme of a DG to the power network. Robust estimates of the signal amplitudes and phase angles obtained by the estimation approach are used in the reference generation of the compensation scheme.
Several simulation tests provided in this chapter show that the proposed scheme handles source and load unbalance, load non-linearity, interharmonic distortion, supply voltage distortion, and synchronism of generated currents or voltages very well. The proposed compensation scheme also prevents distortion in voltage at the point of common coupling in weak source cases, balances the source currents, and brings the supply-side power factor to a desired value.
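To make the general flavour of the estimators described above more concrete, here is a deliberately simplified, hypothetical sketch (not the thesis algorithm): a plain-summation (moving-average) FIR pre-filter followed by a Kalman filter that tracks the amplitude and phase of a power-system signal whose frequency is assumed known; all settings, noise levels and variable names are illustrative assumptions.

```python
# Hypothetical, simplified sketch of the style of estimator outlined above:
# a plain-summation (moving-average) FIR pre-filter feeding a Kalman filter
# that tracks amplitude and phase of a 50 Hz signal of assumed-known frequency.
# This is NOT the thesis algorithm; every setting below is an illustrative guess.
import numpy as np

fs, f0 = 5000.0, 50.0                      # sampling and nominal system frequency (Hz)
n = np.arange(2000)
true_A, true_phi = 230.0, 0.3
x = true_A * np.cos(2 * np.pi * f0 * n / fs + true_phi) + 5.0 * np.random.randn(n.size)

# Plain-summation FIR filter: centred moving average to suppress wideband noise.
# (It slightly attenuates the 50 Hz component, which biases the amplitude a little.)
M = 11
x_f = np.convolve(x, np.ones(M) / M, mode="same")

# Kalman filter with constant state s = [A*cos(phi), A*sin(phi)]^T, so that
# x[k] = s[0]*cos(w*k) - s[1]*sin(w*k) + noise.
s = np.zeros(2)                            # state estimate
P = np.eye(2) * 1e4                        # initial covariance (guessed, cf. Chapter 3)
R = 25.0                                   # measurement-noise variance (assumed)
Q = np.eye(2) * 1e-6                       # small process noise allowing slow drift
for k in range(n.size):
    w = 2 * np.pi * f0 * k / fs
    H = np.array([np.cos(w), -np.sin(w)])  # measurement row vector
    P = P + Q
    K = P @ H / (H @ P @ H + R)            # Kalman gain
    s = s + K * (x_f[k] - H @ s)           # measurement update
    P = (np.eye(2) - np.outer(K, H)) @ P

A_hat, phi_hat = np.hypot(s[0], s[1]), np.arctan2(s[1], s[0])
print(f"amplitude ~ {A_hat:.1f} (true 230.0), phase ~ {phi_hat:.3f} rad (true 0.300)")
```

The sketch only illustrates the division of labour between pre-filtering and Kalman tracking; the thesis designs the FIR filter, transition matrices and initial settings far more carefully (Chapters 3-7) and adds frequency estimation and LES handling of transients.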
Abstract:
Google, Facebook, Twitter, LinkedIn, etc. are some of the prominent large-scale digital service providers that are having a tremendous impact on societies, corporations and individuals. However, despite their rapid uptake and obvious influence on the behavior of individuals and on the business models and networks of organizations, we still lack a deeper, theory-guided understanding of the related phenomena. We use Teece’s notion of complementary assets and extend it towards ‘digital complementary assets’ (DCA) in an attempt to provide such a theory-guided understanding of these digital services. Building on Teece’s theory, we make three contributions. First, we offer a new conceptualization of digital complementary assets in the form of digital public goods and digital public assets. Second, we differentiate three models for how organizations can engage with such digital complementary assets. Third, we find the user base to be a critical factor when considering appropriability.
Abstract:
Language use has proven to be the most complex and complicating of all Internet features, yet people and institutions invest enormously in language and cross-language features because they are fundamental to the success of the Internet’s past, present and future. The thesis focuses on the development of the latter – features that facilitate and signify linking between or across languages – in both their historical and current contexts. In the theoretical analysis, the conceptual platform of inter-language linking is developed both to accommodate efforts towards a new social complexity model for the co-evolution of languages and language content, and to create an open analytical space for language and cross-language related features of the Internet and beyond. The practiced uses of inter-language linking have changed over the last decades. Before and during the first years of the WWW, mechanisms of inter-language linking were at best important elements used to create new institutional or content arrangements, but on a large scale they were insignificant. This has changed with the emergence of the WWW and its development into a web in which content in different languages co-evolves. The thesis traces the inter-language linking mechanisms that facilitated these dynamic changes by analysing what these linking mechanisms are, how their historical as well as current contexts can be understood, and what kinds of cultural-economic innovation they enable and impede. The study discusses this alongside four empirical cases of bilingual or multilingual media use, ranging from television and web services for the languages of smaller populations to large-scale, multi-language web ventures by the British Broadcasting Corporation, the Special Broadcasting Service Australia, Wikipedia and Google. To sum up, the thesis introduces the concepts of ‘inter-language linking’ and the ‘lateral web’ to model the social complexity and co-evolution of languages online. The resulting model reconsiders existing social complexity models in that it is the first that can explain the emergence of large-scale, networked co-evolution of languages and language content facilitated by the Internet and the WWW. Finally, the thesis argues that the Internet enables an open space for language and cross-language related features and investigates how far this process is facilitated by (1) amateurs and (2) human-algorithmic interaction cultures.
Abstract:
This paper presents two case studies of marginalised youth experimenting with digital music production in flexible education settings. The cases were drawn from a three-year study of alternative assessment in flexible learning centres that enrol 650+ students who have left formal schooling in Queensland, Australia. The cases are framed with reference to the literature on cultural studies approaches to education and the digital arts. Each case describes the student’s history, cultural background and experiences, music productions, evidence of learning and re-engagement with education. Findings document how digital music production can re-engage and extend participation among students who have left formal education. They do so by theorising the online judgements and blog comments about the digital music production as a social field of exchange. The paper also raises critical questions about the adequacy of current approaches to evaluating and accounting for the learning and development of such youth, especially where this has occurred through creative arts and digital production.
Abstract:
What opportunities does a channel like Twitter offer to libraries, beyond the realm of marketing? We would like to highlight three roles for Twitter in the academic library environment: Twitter as a service delivery and service recovery channel; Twitter as a community builder; Twitter as a site for information experience.
Abstract:
This working paper reflects upon the opportunities and challenges of designing a form of digital noticeboard system with a remote Aboriginal community that supports their aspirations for both internal and external communication. The project itself has evolved from a relationship built through ecological work between scientists and the local community on the Groote Eylandt archipelago to study native populations of animal species over the long term. In the course of this work the aspiration has emerged to explore how digital noticeboards might support communication on the island and externally. This paper introduces the community, the context and the history of the project. We then reflect upon the science project, its outcomes and a framework empowering the Aboriginal viewpoint, in order to draw lessons for extending what we see as a pragmatic and relationship-based approach towards cross-cultural design.
Abstract:
Network RTK (Real-Time Kinematic) is a technology based on GPS (Global Positioning System), or more generally GNSS (Global Navigation Satellite System), observations to achieve centimeter-level positioning accuracy in real time. It is enabled by a network of Continuously Operating Reference Stations (CORS). CORS placement is an important problem in the design of network RTK, as it directly affects not only the installation and running costs of the network RTK but also the Quality of Service (QoS) it provides. In our preliminary research on CORS placement, we proposed a polynomial heuristic algorithm for a so-called location-based CORS placement problem. From a computational point of view, the location-based CORS placement is a large-scale combinatorial optimization problem. Thus, although the heuristic algorithm is efficient in computation time, it may not be able to find an optimal or near-optimal solution. Aiming at improving the quality of solutions, this paper proposes a repairing genetic algorithm (RGA) for the location-based CORS placement problem. The RGA has been implemented and compared with the heuristic algorithm in experiments. Experimental results show that the RGA produces better-quality solutions than the heuristic algorithm.
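For readers unfamiliar with the idea of a repairing genetic algorithm, the following toy sketch applies one to a simplified, coverage-style placement problem; the real location-based CORS placement model and the RGA of the paper differ in detail, and the candidate sites, service radius and GA settings here are made-up assumptions.

```python
# Toy sketch of a repairing genetic algorithm (RGA) on a simplified,
# coverage-style placement problem. The actual location-based CORS placement
# formulation and the RGA in the paper are more involved; sites, radius and
# GA settings below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
N_SITES, N_POINTS, RADIUS = 40, 200, 25.0
sites = rng.uniform(0, 100, size=(N_SITES, 2))     # candidate CORS locations
points = rng.uniform(0, 100, size=(N_POINTS, 2))   # locations that must be served

# cover[i, j] is True when candidate site j serves demand point i.
cover = np.linalg.norm(points[:, None, :] - sites[None, :, :], axis=2) <= RADIUS

def repair(chrom):
    """Greedily switch on extra sites until every demand point is covered."""
    chrom = chrom.copy()
    uncovered = ~cover[:, chrom == 1].any(axis=1)
    while uncovered.any():
        gain = cover[uncovered].sum(axis=0) * (chrom == 0)   # points gained per unused site
        if gain.max() == 0:                                  # some points are uncoverable
            break
        chrom[int(np.argmax(gain))] = 1
        uncovered = ~cover[:, chrom == 1].any(axis=1)
    return chrom

def cost(chrom):
    """Fewer selected stations means lower installation and running cost."""
    return int(chrom.sum())

POP, GENS, MUT = 30, 150, 0.02
pop = [repair(rng.integers(0, 2, N_SITES)) for _ in range(POP)]
for _ in range(GENS):
    children = []
    for _ in range(POP):
        a, b = rng.choice(POP, 2, replace=False)             # tournament selection (x2)
        p1 = pop[a] if cost(pop[a]) <= cost(pop[b]) else pop[b]
        a, b = rng.choice(POP, 2, replace=False)
        p2 = pop[a] if cost(pop[a]) <= cost(pop[b]) else pop[b]
        cut = int(rng.integers(1, N_SITES))                  # one-point crossover
        child = np.concatenate([p1[:cut], p2[cut:]])
        child ^= (rng.random(N_SITES) < MUT).astype(child.dtype)   # bit-flip mutation
        children.append(repair(child))                       # repair infeasible offspring
    pop = sorted(pop + children, key=cost)[:POP]             # elitist survivor selection

print("stations selected:", cost(min(pop, key=cost)))
```

The defining feature of this GA family is the repair step: instead of penalising infeasible chromosomes in the fitness function, every offspring is repaired into a feasible placement before it competes for survival.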
Abstract:
With the growth in the number and sophistication of widely available services, there is a new urgency for comprehensive service descriptions that take into account both technical and business aspects. Recent years have seen a number of efforts towards best-of-breed service descriptions focusing on specific aspects of services. The Handbook of Service Description provides the most advanced state-of-the-art insights into these. The main parts of the book provide the most detailed documentation of the Unified Service Description Language (USDL) to date. USDL has been developed across several research institutes and publicly funded projects across Europe and Australia, and is currently extending to the Americas as part of a standardization push through W3C. The scope of services extends across IT and business, i.e., the socio-technical sense of services scaled to business networks. In this respect, purely human, purely automated and mixed human/automated services are considered, whose boundary of cognizance is made available through the tasks of service provisioning, discovery, access and delivery. Taken together, the Handbook of Service Description provides a comprehensive reference suitable for a wide-reaching audience including researchers, practitioners, managers, and students who aspire to learn about or to create a deeper scientific foundation for service description and its methodological aspects.
Abstract:
The development of public service broadcasters (PSBs) in the 20th century was framed around debates about their difference from commercial broadcasting. These debates navigated between two poles. One concerned the relationship between non-commercial sources of funding and the role played by statutory Charters as guarantors of the independence of PSBs. The other concerned the relationship between PSBs being both a complementary and a comprehensive service, although there are tensions inherent in this duality. In the 21st century, as reconfigured public service media organisations (PSMs) operate across multiple platforms in a convergent media environment, how are these debates changing, if at all? Is the case for PSM “exceptionalism” changed by Web-based services, catch-up TV, podcasting, ancillary product sales, and the commissioning of programs from external sources in order to operate in highly diversified cross-media environments? Do the traditional assumptions about non-commercialism still hold as the basis for different forms of PSM governance and accountability? This paper will consider the question of PSM exceptionalism in the context of three reviews into Australian media that took place over 2011-2012: the Convergence Review undertaken through the Department of Broadband, Communications and the Digital Economy; the National Classification Scheme Review undertaken by the Australian Law Reform Commission; and the Independent Media Inquiry that considered the future of news and journalism.
Abstract:
Historically, the public service broadcaster (PSB) acted beyond its institutional broadcasting remit by initiating and facilitating activities to support cultural infrastructure and national identity (Wilson, Hutchinson and Shea 2010). The recent focus on developing new content delivery platforms and services (Debrett 2010) signifies a semantic shift from the PSB to the public service media (PSM) organisation. The Australian PSM organisation, the Australian Broadcasting Corporation (ABC), has moved beyond the era of ‘online’ publishing to incorporate Web 2.0 technologies to foster new relationships with the audience (Walker 2009) and engage in production activities with participatory cultures (Jenkins 2006). This shift presents opportunities and challenges to traditional media production, the existing editorial policies and governance models, and raises questions around the value of PSM experimental and innovative activities. Further, the incorporation of information communication technologies and participatory cultures challenges the core values of ‘public service’ within PSM. This paper examines ABC Pool (abc.net.au/pool) as a means of extending the ABC’s public service remit by incorporating participatory cultures into the production and governance models of the corporation, and critically analyses the public value of such innovative experiments.
Abstract:
Web 2.0 technologies have mobilised collaborative peer production and participatory cultures for online content creation. However, not all online communities engaging in these activities are independently facilitated; they often operate within the auspices of the cultural institutions that develop and resource them. Borrowing from the principles of Wikipedia, which support collaborative online content creation and online community, ABC Pool (abc.net.au/pool) is one such institutional online community, operating with the support of the Australian Public Service Broadcaster (PSB), the Australian Broadcasting Corporation (ABC). This paper explores the collaborative, creative, and governance activities of an institutional online community and how the community manager acts as an intermediary within these arrangements.
Abstract:
Organizations increasingly make use of social media in order to compete for customer awareness and to improve the quality of their goods and services. Multiple techniques of social media analysis are already in use. Nevertheless, theoretical underpinnings and a sound research agenda are still unavailable in this field at the present time. In order to contribute to setting up such an agenda, we introduce digital social signal processing (DSSP) as a new research stream in IS that requires multi-faceted investigation. Our DSSP concept is founded upon a set of four sequential activities: sensing digital social signals that are emitted by individuals on social media; decoding online data of social media in order to reconstruct digital social signals; matching the signals with consumers’ life events; and configuring individualized goods and service offerings tailored to the individual needs of customers. We further contribute to tying together loose ends of different research areas in order to frame DSSP as a field for further investigation. We conclude by developing a research agenda.
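To make the four sequential DSSP activities a little more tangible, here is a purely hypothetical sketch of how they could be chained as a processing pipeline; the abstract proposes a research agenda rather than an implementation, and every class name, function name and rule below is an invented illustration.

```python
# Hypothetical sketch only: the four sequential DSSP activities (sensing,
# decoding, matching, configuring) chained as a simple pipeline. All names
# and rules are illustrative; the paper defines a research agenda, not code.
from dataclasses import dataclass
from typing import List

@dataclass
class SocialSignal:
    user_id: str
    life_event: str          # e.g. "moving house", inferred from posts
    confidence: float

def sense(posts: List[dict]) -> List[dict]:
    """Sensing: collect raw posts emitted by individuals on social media."""
    return [p for p in posts if p.get("public", False)]

def decode(raw_posts: List[dict]) -> List[SocialSignal]:
    """Decoding: reconstruct digital social signals from the online data."""
    signals = []
    for p in raw_posts:
        if "new apartment" in p["text"].lower():    # toy rule standing in for real NLP
            signals.append(SocialSignal(p["user"], "moving house", 0.7))
    return signals

def match(signals: List[SocialSignal]) -> List[SocialSignal]:
    """Matching: keep signals that correspond to relevant consumer life events."""
    return [s for s in signals if s.confidence >= 0.5]

def configure(signals: List[SocialSignal]) -> List[str]:
    """Configuring: tailor individualized offerings to the detected life events."""
    return [f"offer home-insurance bundle to {s.user_id}" for s in signals]

posts = [{"user": "u1", "text": "Just moved into my new apartment!", "public": True}]
print(configure(match(decode(sense(posts)))))
```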
Abstract:
There is a song at the beginning of the musical West Side Story in which the character Tony sings that “something’s coming, something good.” The song is an anthem of optimism, brimming with promise. This paper is about the long-held promise of information and communication technology (ICT) to transform teaching and learning, to modernise the learning environment of the classroom, and to create a new digital pedagogy. But much of our experience to date in the schooling sector tells more of resistance and reaction than revolution, of more of the same but with a computer in the corner, and of ICT activities as unwelcome time-fillers/time-wasters. Recently, a group of pre-service teachers in a postgraduate primary education degree at an Australian university were introduced to learning objects in an ICT immersion program. Their analyses and related responses, as recorded in online journals, have been interpreted here in terms of TPACK (Technological Pedagogical and Content Knowledge). Contrary to contemporary observation, these students generally displayed high levels of competence and highly positive dispositions towards the integration of ICT in their future classrooms. In short, they displayed the same optimism and confidence as the fictional “Tony” in believing that something good was coming.