428 results for Networked Digital Environment
Abstract:
This article reports on a research program that has developed new methodologies for mapping the Australian blogosphere and tracking how information is disseminated across it. The authors improve on conventional web crawling methodologies in a number of significant ways: first, they track blogging activity as it occurs, by scraping new blog posts when such posts are announced through Really Simple Syndication (RSS) feeds. Second, they use custom-made tools that distinguish between the different types of content and thus allow them to analyze only the salient discursive content provided by bloggers. Finally, they are able to examine these better-quality data using both link network mapping and textual analysis tools, to produce both cumulative longer-term maps of interlinkages and themes, and specific shorter-term snapshots of current activity that indicate current clusters of heavy interlinkage and highlight their key themes. In this article, the authors discuss findings from a yearlong observation of the Australian political blogosphere, suggesting that Australian political bloggers consistently address current affairs, but interpret them differently from mainstream news outlets. The article also discusses the next stage of the project, which extends this approach to an examination of other social networks used by Australians, including Twitter, YouTube, and Flickr. This adaptation of the methodology moves away from narrow models of political communication, and toward an investigation of everyday and popular communication, providing a more inclusive and detailed picture of the Australian networked public sphere.
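The RSS-driven tracking the abstract describes — scraping a post only when its feed announces it — can be sketched in a few lines. The authors' actual tools are custom-made and not published here; the feed content, URLs, and the duplicate-suppression scheme below are illustrative assumptions only.

```python
import xml.etree.ElementTree as ET

# A toy RSS 2.0 document standing in for a blog's feed (hypothetical content).
RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Politics Blog</title>
  <item><title>Budget reply</title><link>http://example.org/p1</link>
        <pubDate>Mon, 01 Mar 2010 09:00:00 GMT</pubDate></item>
  <item><title>Poll roundup</title><link>http://example.org/p2</link>
        <pubDate>Tue, 02 Mar 2010 09:00:00 GMT</pubDate></item>
</channel></rss>"""

def new_posts(feed_xml, seen_links):
    """Return (title, link) pairs for items not seen in earlier polls."""
    channel = ET.fromstring(feed_xml).find("channel")
    posts = []
    for item in channel.findall("item"):
        link = item.findtext("link")
        if link not in seen_links:
            posts.append((item.findtext("title"), link))
            seen_links.add(link)
    return posts

seen = {"http://example.org/p1"}   # links already scraped on a previous poll
fresh = new_posts(RSS, seen)
print(fresh)                       # only the post announced since the last poll
```

Polling feeds this way captures activity as it occurs, rather than discovering posts after the fact through a crawl.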
Abstract:
Diffusion is the process that leads to the mixing of substances as a result of spontaneous and random thermal motion of individual atoms and molecules. It was first detected by the Scottish botanist Robert Brown in 1827, and the phenomenon became known as ‘Brownian motion’. More specifically, the motion observed by Brown was translational diffusion – thermal motion resulting in random variations of the position of a molecule. This type of motion was given a correct theoretical interpretation in 1905 by Albert Einstein, who derived the relationship between temperature, the viscosity of the medium, the size of the diffusing molecule, and its diffusion coefficient. It is translational diffusion that is indirectly observed in MR diffusion-tensor imaging (DTI). The relationship obtained by Einstein provides the physical basis for using translational diffusion to probe the microscopic environment surrounding the molecule.
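The relationship Einstein derived is the Stokes–Einstein equation, D = k_B·T / (6πηr), linking the diffusion coefficient D to temperature T, medium viscosity η, and molecular radius r. A minimal numerical sketch (the radius and viscosity values below are rough illustrative assumptions, not measured data):

```python
import math

def stokes_einstein_D(T, eta, r):
    """Einstein's 1905 relation: diffusion coefficient of a sphere of
    radius r in a medium of viscosity eta at temperature T (SI units)."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return k_B * T / (6 * math.pi * eta * r)

# Illustrative numbers: a small molecule of radius ~0.14 nm diffusing
# in water at body temperature (viscosity ~0.69 mPa·s).
D = stokes_einstein_D(T=310.0, eta=0.69e-3, r=0.14e-9)
print(f"D = {D:.2e} m^2/s")   # on the order of 1e-9 m^2/s
```

Values on this order (around 10⁻⁹ m²/s) are the regime probed indirectly by diffusion-weighted MR sequences.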
Abstract:
-
Abstract:
Mobile sensor platforms such as Autonomous Underwater Vehicles (AUVs) and robotic surface vessels, combined with static moored sensors, compose a diverse sensor network that provides a macroscopic environmental analysis tool for ocean researchers. Working as a cohesive networked unit, the static buoys are always online, and provide insight as to the times and locations where a federated, mobile robot team should be deployed to effectively perform large-scale spatiotemporal sampling on demand. Such a system can provide pertinent in situ measurements to marine biologists, who can then advise policy makers on critical environmental issues. This poster presents recent field deployment activity of AUVs demonstrating the effectiveness of our embedded communication network infrastructure throughout southern California coastal waters. We also report on progress towards real-time, web-streamed data from the multiple sampling locations and mobile sensor platforms. Static monitoring sites included in this presentation detail the network nodes positioned at Redondo Beach and Marina Del Rey. Among the mobile sensors highlighted here are autonomous Slocum gliders. These nodes operate in the open ocean for periods as long as one month. The gliders are connected to the network via a Freewave radio modem network composed of multiple coastal base stations. This increases the efficiency of deployment missions by reducing operational expenses through reduced reliance on satellite phones for communication, as well as by increasing the rate and amount of data that can be transferred. Other mobile sensor platforms presented in this study are the autonomous robotic boats. These platforms are utilized for harbor and littoral zone studies, and are capable of performing multi-robot coordination while observing known communication constraints.
All of these pieces fit together to present an overview of ongoing collaborative work to develop an autonomous, region-wide, coastal environmental observation and monitoring sensor network.
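The buoy-to-robot handoff described above — always-online static nodes suggesting when and where the mobile team should sample — can be sketched as a simple anomaly-ranking rule. The site readings, the chlorophyll variable, and the ranking criterion below are invented for illustration; the poster does not specify the actual tasking logic.

```python
def deployment_targets(readings, k=2):
    """readings: {site: sensor reading}; return the k sites whose
    readings deviate most from the network-wide mean, i.e. the most
    promising places to send the mobile robot team."""
    mean = sum(readings.values()) / len(readings)
    ranked = sorted(readings,
                    key=lambda s: abs(readings[s] - mean),
                    reverse=True)
    return ranked[:k]

# Hypothetical chlorophyll readings from the static buoy network.
buoys = {"Redondo Beach": 4.1, "Marina Del Rey": 9.8,
         "Site C": 4.3, "Site D": 1.0}
targets = deployment_targets(buoys, k=2)
print(targets)   # the two most anomalous sites
```

A real system would weigh communication range and vehicle endurance as well, but the division of labor is the same: cheap continuous sensing triggers expensive mobile sampling.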
Abstract:
Digital modelling tools are the next generation of computer aided design (CAD) tools for the construction industry. They allow a designer to build a virtual model of the building project before the building is constructed. This supports a whole range of analyses, and the identification and resolution of problems before they arise on-site, in ways that were previously not feasible.
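One concrete example of "identifying problems before they arise on-site" is clash detection: checking whether modelled building elements occupy the same space. This is a minimal sketch using axis-aligned bounding boxes, not the API of any actual digital modelling tool; the element names and coordinates are invented.

```python
def clashes(elements):
    """elements: {name: ((xmin, ymin, zmin), (xmax, ymax, zmax))}.
    Return pairs of elements whose bounding boxes overlap in all
    three axes, i.e. candidate spatial conflicts."""
    names = sorted(elements)
    found = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            (a0, a1), (b0, b1) = elements[a], elements[b]
            if all(a0[d] < b1[d] and b0[d] < a1[d] for d in range(3)):
                found.append((a, b))
    return found

model = {
    "duct":   ((0, 0, 2.8), (5, 0.4, 3.2)),
    "beam":   ((2, -1, 3.0), (2.3, 2, 3.4)),  # crosses the duct's path
    "column": ((8, 0, 0), (8.4, 0.4, 3.5)),
}
print(clashes(model))   # the duct/beam conflict, caught before site work
```

Production tools resolve clashes against exact element geometry, but the bounding-box pass above is the standard cheap first filter.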
Abstract:
Road surface macro-texture is an indicator used to determine skid resistance levels in pavements. Existing methods of quantifying macro-texture include the sand patch test and the laser profilometer. These methods utilise the 3D information of the pavement surface to extract the average texture depth. Recently, interest has arisen in image processing techniques as a quantifier of macro-texture, mainly using the Fast Fourier Transform (FFT). This paper reviews the FFT method, and then proposes two new methods, one using the autocorrelation function and the other using wavelets. The methods are tested on pictures obtained from a pavement surface extending more than 2 km. About 200 images were acquired from the surface at approximately 10 m intervals, from a height of 80 cm above the ground. The results obtained from image analysis methods using the FFT, the autocorrelation function and wavelets are compared with sensor measured texture depth (SMTD) data obtained from the same paved surface. The results indicate that coefficients of determination (R²) exceeding 0.8 are obtained when up to 10% of outliers are removed.
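The intuition behind the proposed autocorrelation method can be shown in one dimension: a coarse texture stays correlated with itself over long pixel lags, a fine texture decorrelates quickly, so the lag at which correlation drops below a threshold acts as a texture-scale proxy. The paper works on 2D pavement images; this pure-Python sketch reduces the idea to synthetic brightness profiles, and the 0.5 threshold is an assumption, not the paper's parameter.

```python
import math

def autocorr(profile, lag):
    """Sample autocorrelation of a 1D brightness profile at a given lag."""
    n = len(profile)
    mean = sum(profile) / n
    var = sum((v - mean) ** 2 for v in profile)
    cov = sum((profile[i] - mean) * (profile[i + lag] - mean)
              for i in range(n - lag))
    return cov / var

def texture_lag(profile, threshold=0.5):
    """Smallest lag at which autocorrelation falls below the threshold:
    larger values indicate a coarser surface texture."""
    for lag in range(1, len(profile)):
        if autocorr(profile, lag) < threshold:
            return lag
    return len(profile)

coarse = [math.sin(i / 8.0) for i in range(128)]   # slowly varying surface
fine   = [math.sin(i / 2.0) for i in range(128)]   # rapidly varying surface
print(texture_lag(coarse), texture_lag(fine))      # coarse decorrelates later
```

Relating a statistic like this to sensor measured texture depth is exactly the regression the paper evaluates with its R² figures.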
Abstract:
This paper describes a lead project currently underway through Australia’s Sustainable Built Environment National Research Centre, evaluating impacts, diffusion mechanisms and uptake of R&D in the Australian building and construction industry. Building on a retrospective analysis of R&D trends and industry outcomes, a future-focused industry roadmap will be developed to inform R&D policies more attuned to future industry needs to improve investment effectiveness. In particular, this research will evaluate national R&D efforts to develop, test and implement advanced digital modelling technologies into the design/construction/asset management cycle. This research will build new understandings and knowledge relevant to R&D funding strategies, research team formation and management (with involvement from public and private sectors, and research and knowledge institutions), dissemination of outcomes and uptake. This is critical due to the disaggregated nature of the industry, intense competition, limited R&D investment, and new challenges (e.g. digital modelling, integrated project delivery, and the demand for packaged services). The evaluation of leading Australian and international efforts to integrate advanced digital modelling technologies into the design/construction/asset management cycle will be undertaken as one of three case studies. Employing the recently released Australian Guidelines for Digital Modelling developed with buildingSMART (International Alliance for Interoperability) and the Australian Institute of Architects, technical and business benefits across the supply chain will be highlighted as drivers for more integrated R&D efforts.
Abstract:
Many governments worldwide are attempting to increase accountability, transparency, and the quality of services by adopting information and communications technologies (ICTs) to modernize and change the way their administrations work. Meanwhile, e-government is becoming a significant decision-making and service tool at local, regional and national government levels. The vast majority of users of these government online services see significant benefits from being able to access services online. The rapid pace of technological development has created increasingly more powerful ICTs that are capable of radically transforming public institutions and private organizations alike. These technologies have proven to be extraordinarily useful instruments in enabling governments to enhance the quality, speed of delivery and reliability of services to citizens and to business (VanderMeer & VanWinden, 2003). However, just because the technology is available does not mean it is accessible to all. The term digital divide has been used since the 1990s to describe patterns of unequal access to ICTs—primarily computers and the Internet—based on income, ethnicity, geography, age, and other factors. Over time it has evolved to more broadly define disparities in technology usage, resulting from a lack of access, skills, or interest in using technology. This article provides an overview of recent literature on e-government and the digital divide, and includes a discussion on the potential of e-government in addressing the digital divide.
Abstract:
Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, with these signals being referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering which occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only impacted traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the Lecture Notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, Discrete-time Fourier, and Discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. Design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II (which is suitable for an advanced signal processing course) considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
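Convolution, listed above among Part I's core topics, can be illustrated with a direct implementation of the convolution sum y[n] = Σₖ x[k]·h[n−k], which gives the output of a linear time-invariant digital system from its input x and impulse response h. The test signal and filter below are invented examples, not taken from the book.

```python
def convolve(x, h):
    """Direct evaluation of the discrete convolution sum
    y[n] = sum_k x[k] * h[n - k]; output length is len(x)+len(h)-1."""
    y = [0.0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(x)):
            if 0 <= n - k < len(h):
                y[n] += x[k] * h[n - k]
    return y

# A 3-point moving-average filter (a simple low-pass digital system)
# applied to a short ramp signal:
x = [1.0, 2.0, 3.0, 4.0]
h = [1 / 3, 1 / 3, 1 / 3]
print(convolve(x, h))
```

The middle samples of the output are the 3-point running means of the input (2.0 and 3.0), showing the smoothing action of the filter; fast implementations would instead use the FFT-based methods treated in the transforms chapters.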
Abstract:
The emergence of mobile and ubiquitous computing has created what is referred to as a hybrid space – a virtual layer of digital information and interaction opportunities that sits on top of, and augments, the physical environment. The increasing connectedness through such media, from anywhere to anybody at any time, makes us less dependent on being physically present somewhere in particular. But what is the role of ubiquitous computing in making physical presence at a particular place more attractive? Acknowledging historic context and identity as important attributes of place, this work embarks on a ‘global sense of place’ in which the cultural diversity, multiple identities, backgrounds, skills and experiences of people traversing a place are regarded as social assets of that place. The aim is to explore ways in which the physical architecture and infrastructure of a place can be mediated towards making invisible social assets visible, thus augmenting people’s situated social experience. Thereby, the focus is on embodied media, i.e. media that materialise digital information as observable and sometimes interactive parts of the physical environment, and hence amplify people’s real-world experience rather than substituting it or moving it to virtual spaces.
Abstract:
Healthy and sustainable food is gaining more attention from consumers, industry, and researchers. Yet many approaches to date are limited to information dissemination, advertisement or education. We have embarked on a three-year collaborative research project (2011 – 2013) to explore urban food practices – eating, cooking, growing food – to support the well-being of people and the environment. Our overall goal is to employ a user-centred interaction design research approach to inform the development of entertaining, real-time, mobile and networked applications, using engaging, playful feedback to build motivation. Our aspiration for this study is to deliver usable and useful mobile and situated interaction prototypes that employ individual and group strategies to foster food cultures that provide new pathways to produce, share and enjoy food that is green, healthy, and fun.
Abstract:
This chapter examines how a change in school leadership can successfully address competencies in complex situations and thus create a positive learning environment in which Indigenous students can excel in their learning rather than accept a culture that inhibits school improvement. Mathematics has long been an area that has failed to assist Indigenous students in improving their learning outcomes, as it is a Eurocentric subject (Rothbaum, Weisz, Pott, Miyake & Morelli, 2000; De Plevitz, 2007) and does not contextualize pedagogy with Indigenous culture and perspectives (Matthews, Cooper & Baturo, 2007). The chapter explores the work of a team of Indigenous and non-Indigenous academics from the YuMi Deadly Centre who are turning the tide on improving Indigenous mathematical outcomes in schools and in communities with high numbers of Aboriginal and Torres Strait Islander students.