302 results for Data-communication
Abstract:
This paper presents a deterministic modelling approach to predict diffraction loss for an innovative Multi-User-Single-Antenna (MUSA) MIMO technology proposed for rural Australian environments. To calculate diffraction loss, six receivers have been considered around an access point in a selected rural environment. Generated terrain profiles for the six receivers are presented in this paper. Simulation results using classical diffraction models and diffraction theory, accounting for rural Australian terrain data, are also presented. Results show that in an area of 900 m by 900 m surrounding the receivers, path loss due to diffraction can range between 5 dB and 35 dB. Diffraction loss maps can help determine the optimal locations for receivers of MUSA-MIMO systems in rural areas.
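A common building block for such deterministic predictions is the single knife-edge diffraction model. The sketch below uses the standard ITU-R P.526 approximation of knife-edge loss as a function of the Fresnel-Kirchhoff parameter; the frequency, distances and obstacle height are illustrative values only, not figures from the paper.

```python
import math

def knife_edge_loss_db(h_m, d1_m, d2_m, freq_hz):
    """Approximate single knife-edge diffraction loss (dB).

    h_m    : obstacle height above the direct transmitter-receiver path (m)
    d1_m   : distance from transmitter to obstacle (m)
    d2_m   : distance from obstacle to receiver (m)
    freq_hz: carrier frequency (Hz)

    Uses the ITU-R P.526 approximation J(v) of loss as a function of the
    Fresnel-Kirchhoff diffraction parameter v; valid for v > -0.78.
    """
    wavelength = 3.0e8 / freq_hz
    v = h_m * math.sqrt(2.0 * (d1_m + d2_m) / (wavelength * d1_m * d2_m))
    if v <= -0.78:
        return 0.0  # obstacle well below the path: negligible diffraction loss
    return 6.9 + 20.0 * math.log10(math.sqrt((v - 0.1) ** 2 + 1.0) + v - 0.1)

# Illustrative numbers only: a 10 m obstruction halfway along a 1.8 km link at 2.4 GHz
print(round(knife_edge_loss_db(h_m=10.0, d1_m=900.0, d2_m=900.0, freq_hz=2.4e9), 1), "dB")
```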
Abstract:
Strategic communication is held to be a key process by which organisations respond to environmental uncertainty. In the received view articulated in the literatures of organisational communication and public relations, strategic communication results from collaborative efforts by organisational members to create shared understanding about environmental uncertainty and, as a result of this collective understanding, formulate appropriate communication responses. In this study, I explore how such collaborative efforts towards the development of strategic communication are derived from, and bounded by, culturally shared values and assumptions. Study of the influences of an organisation’s culture on the formulation of strategic communication is a fundamental conceptual challenge for public relations and, to date, a largely unaddressed area of research. This thesis responds to this challenge by describing a key property of organisational culture – the action of cultural selection (Durham, 1992). I integrate this property of cultural selection to extend and refine the descriptive range of Weick’s (1969, 1979) classic sociocultural model of organizing. From this integration I propose a new model, the Cultural Selection of Strategic Communication (CSSC). Underpinning the CSSC model is the central proposition that because of the action of cultural selection during organizing processes, the inherently conservative properties of an organisation’s culture constrain the development of effective strategic communication in ways that may be unrelated to the outcomes of “environmental scanning” and other monitoring functions heralded by the public relations literature as central to organisational adaptation. Thus, by examining the development of strategic communication, I describe a central conservative influence on the social ecology of organisations. This research also responds to Butschi and Steyn’s (2006) call for the development of theory focusing on strategic communication, as well as Grunig’s (2006) and Sriramesh’s (2007) calls for research to further understand the role of culture in public relations practice. In keeping with the explorative and descriptive goals of this study, I employ organisational ethnography to examine the influence of cultural selection on the development of strategic communication. In this methodological approach, I use the technique of progressive contextualisation to compare data from two related but distinct cultural settings. This approach provides a range of descriptive opportunities to permit a deeper understanding of the work of cultural selection. The findings of this study suggest that culture, operating as a system of shared and socially transmitted social knowledge, acts through the property of cultural selection to influence decision making and to decrease conceptual variation within a group. The findings support the view that strategic communication, as a cultural product derived from the influence of cultural selection, is an essential feature for understanding the social ecology of an organisation.
Abstract:
This special issue of the Journal of Urban Technology brings together five articles that are based on presentations given at the Street Computing workshop held on 24 November 2009 in Melbourne in conjunction with the Australian Computer-Human Interaction conference (OZCHI 2009). Our own article introduces the Street Computing vision and explores the potential, challenges and foundations of this research vision. In order to do so, we first look at the currently available sources of information and discuss their link to existing research efforts. Section 2 then introduces the notion of Street Computing and our research approach in more detail. Section 3 looks beyond the core concept itself and summarises related work in this field of interest.
Abstract:
Swelling social need and competing calls on government funds have heightened the value of the philanthropic dollar. Yet Australia is not regarded as having a robust giving culture: while 86% of adults give, a mere 16% plan their giving, with those who do donating four times as much as spontaneous givers (Giving Australia, 2005). Traditionally, the prime planned giving example is a charitable bequest, a revenue stream not prevalent here (Baker, 2007). In fact, Baker’s Victorian probate data show that under 5% of estates provide a charitable bequest and just over 1% of estate assets are bequeathed. The UK, in contrast, sources 30% and the US 10% of charitable income through bequests (NCVO, 2004; Sargeant, Wymer and Hilton, 2006). Australian charities could boost bequest giving. Understanding the donor market, those who have remembered or may remember them in their will, is critical. This paper reports donor perceptions of Australian charities’ bequest communication and marketing. The data form part of a wider study of Australian donors’ bequest attitudes and behaviour. Charities spend heavily on bequest promotion, from advertising to personal selling to public relations. Infrastructure funds are scarce, so guidance on what works for donors is important. Guy and Patton (1988) made their classic call for a nonprofit marketing perspective and identified the need for charities to better understand the motivations and behaviour of their supporters. In a similar vein, this study aims to improve the way nonprofits and givers interact and, ultimately, to enhance the giving experience and thus multiply planned giving participation. Academically, it offers insights into Australian bequest motivations and attitudes not studied empirically before.
Abstract:
Cities accumulate and distribute vast sets of digital information. Many decision-making and planning processes in councils, local governments and organisations are based on both real-time and historical data. Until recently, only a small, carefully selected subset of this information has been released to the public, usually for specific purposes (e.g. train timetables or the release of planning applications through websites). This situation is, however, changing rapidly. Regulatory frameworks such as Freedom of Information legislation in the US, the UK, the European Union and many other countries guarantee public access to data held by the state. One result of this legislation, and of changing attitudes towards open data, has been the widespread release of public information as part of recent Government 2.0 initiatives. This includes the creation of public data catalogues such as data.gov (U.S.), data.gov.uk (U.K.) and data.gov.au (Australia) at federal government levels, and datasf.org (San Francisco) and data.london.gov.uk (London) at municipal levels. The release of this data has opened up the possibility of a wide range of future applications and services which are now the subject of intensified research efforts. Previous research endeavours have explored the creation of specialised tools to aid decision-making by urban citizens, councils and other stakeholders (Calabrese, Kloeckl & Ratti, 2008; Paulos, Honicky & Hooker, 2009). While these initiatives represent an important step towards open data, they too often result in mere collections of data repositories. Proprietary database formats and the lack of an open application programming interface (API) limit the full potential that could be achieved by allowing these data sets to be cross-queried. Our research, presented in this paper, looks beyond the pure release of data. It is concerned with three essential questions: First, how can data from different sources be integrated into a consistent framework and made accessible? Second, how can ordinary citizens be supported in easily composing data from different sources in order to address their specific problems? Third, what are the interfaces that make it easy for citizens to interact with data in an urban environment, and how can data be accessed and collected?
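As a minimal illustration of cross-querying openly released data, the sketch below fetches two JSON datasets and joins their records on a shared key. The endpoint URLs and the "suburb" field are placeholders invented for the example, not references to any of the catalogues above.

```python
import json
from urllib.request import urlopen

# Placeholder endpoints: substitute any two open-data services that return JSON
# records sharing a common key (here, a suburb name).
TIMETABLE_URL = "https://data.example.gov/api/train-stops.json"    # hypothetical
PLANNING_URL  = "https://data.example.gov/api/planning-apps.json"  # hypothetical

def fetch_json(url):
    """Download and decode a JSON dataset from an open-data endpoint."""
    with urlopen(url) as response:
        return json.load(response)

def join_on(key, left, right):
    """Cross-query two record lists by merging rows that share the same key value."""
    index = {row[key]: row for row in right}
    return [{**row, **index[row[key]]} for row in left if row[key] in index]

if __name__ == "__main__":
    stops = fetch_json(TIMETABLE_URL)        # e.g. [{"suburb": "Kelvin Grove", ...}, ...]
    applications = fetch_json(PLANNING_URL)  # e.g. [{"suburb": "Kelvin Grove", ...}, ...]
    merged = join_on("suburb", stops, applications)
    print(f"{len(merged)} records appear in both datasets")
```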
Abstract:
Concerns regarding groundwater contamination with nitrate and the long-term sustainability of groundwater resources have prompted the development of a multi-layered three-dimensional (3D) geological model to characterise the aquifer geometry of the Wairau Plain, Marlborough District, New Zealand. The 3D geological model, which consists of eight litho-stratigraphic units, has subsequently been used to synthesise hydrogeological and hydrogeochemical data for different aquifers in an approach that aims to demonstrate how integration of water chemistry data within the physical framework of a 3D geological model can help to better understand and conceptualise groundwater systems in complex geological settings. Multivariate statistical techniques (e.g. Principal Component Analysis and Hierarchical Cluster Analysis) were applied to groundwater chemistry data to identify hydrochemical facies which are characteristic of distinct evolutionary pathways and a common hydrologic history of groundwaters. Principal Component Analysis on hydrochemical data demonstrated that natural water-rock interactions, redox potential and human agricultural impact are the key controls on groundwater quality in the Wairau Plain. Hierarchical Cluster Analysis revealed distinct hydrochemical water quality groups in the Wairau Plain groundwater system. Visualisation of the results of the multivariate statistical analyses and the distribution of groundwater nitrate concentrations in the context of aquifer lithology highlighted the link between groundwater chemistry and the lithology of host aquifers. The methodology followed in this study can be applied in a variety of hydrogeological settings to synthesise geological, hydrogeological and hydrochemical data and present them in a format readily understood by a wide range of stakeholders. This enables more efficient communication of the results of scientific studies to the wider community.
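As a minimal sketch of the multivariate workflow described above, the following example standardises a small, made-up table of hydrochemical analyses, runs Principal Component Analysis, and groups the samples with Ward-linkage Hierarchical Cluster Analysis. The analyte names and concentrations are illustrative and do not come from the Wairau Plain dataset.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical hydrochemical table: one row per groundwater sample, columns are
# major-ion and nutrient concentrations (mg/L). Values are invented.
samples = pd.DataFrame({
    "Ca":    [45.0, 60.2, 12.4, 55.1],
    "Mg":    [10.3, 14.8, 3.1, 12.9],
    "Na":    [8.7, 11.2, 30.5, 9.9],
    "Cl":    [12.0, 15.4, 40.2, 13.8],
    "HCO3":  [150.0, 180.3, 60.7, 170.2],
    "NO3_N": [2.1, 6.8, 0.3, 5.4],
})

# Standardise so each analyte contributes equally regardless of units and scale
scaled = StandardScaler().fit_transform(samples)

# Principal Component Analysis: identify the dominant controls on water chemistry
pca = PCA(n_components=2)
scores = pca.fit_transform(scaled)
print("Explained variance ratio:", pca.explained_variance_ratio_)

# Hierarchical Cluster Analysis (Ward linkage): group samples into hydrochemical facies
tree = linkage(scaled, method="ward")
facies = fcluster(tree, t=2, criterion="maxclust")
print("Cluster membership:", facies)
```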
Abstract:
During the course of several natural disasters in recent years, Twitter has been found to play an important role as an additional medium for many-to-many crisis communication. Emergency services are successfully using Twitter to inform the public about current developments, and are increasingly also attempting to source first-hand situational information from Twitter feeds (such as relevant hashtags). However, the further study of the uses of Twitter during natural disasters relies on the development of flexible and reliable research infrastructure for tracking and analysing Twitter feeds at scale and in close to real time. This article outlines two approaches to the development of such infrastructure: one which builds on the readily available open source platform yourTwapperkeeper to provide a low-cost, simple and basic solution; and one which establishes a more powerful and flexible framework by drawing on highly scalable, state-of-the-art technology.
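A basic building block of such tracking infrastructure is the aggregation of captured tweets by hashtag over time. The sketch below reads a file of newline-delimited tweet JSON (for example, as exported by an archiving tool) and tallies hashtag use per hour; the "created_at" and "text" field names and the timestamp format follow classic Twitter API conventions and may need adjusting to a given capture format.

```python
import json
import re
from collections import Counter
from datetime import datetime

HASHTAG = re.compile(r"#\w+")

def hashtag_counts_per_hour(path):
    """Tally hashtag use per hour from a file of newline-delimited tweet JSON.

    Assumes each line has at least 'created_at' and 'text' fields, as produced
    by a typical collection tool; adjust to your capture format if needed.
    """
    buckets = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            tweet = json.loads(line)
            ts = datetime.strptime(tweet["created_at"], "%a %b %d %H:%M:%S %z %Y")
            hour = ts.strftime("%Y-%m-%d %H:00")
            counter = buckets.setdefault(hour, Counter())
            counter.update(tag.lower() for tag in HASHTAG.findall(tweet["text"]))
    return buckets

if __name__ == "__main__":
    for hour, counts in sorted(hashtag_counts_per_hour("tweets.jsonl").items()):
        print(hour, counts.most_common(5))
```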
Abstract:
Encryption is a well-established technology for protecting sensitive data. However, once encrypted, the data can no longer be easily queried, and database performance depends on how the sensitive data are encrypted. In this paper we review conventional encryption methods, which support only limited querying, and propose an encryption method for numerical data that can be queried effectively. The proposed system includes the design of the service scenario and the metadata.
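One generic way to make encrypted numerical data queryable, shown purely for illustration and not necessarily the method proposed in the paper, is to store a coarse bucket identifier as metadata alongside each ciphertext: range queries are pre-filtered over the buckets and then refined after decryption. The sketch below uses the third-party cryptography package's Fernet cipher; the bucket width and field names are assumptions for the example.

```python
from cryptography.fernet import Fernet

KEY = Fernet.generate_key()
CIPHER = Fernet(KEY)
BUCKET_WIDTH = 1000  # metadata granularity: wider buckets leak less but filter less

def encrypt_value(value: int):
    """Store the ciphertext plus a coarse bucket id as query metadata."""
    token = CIPHER.encrypt(str(value).encode())
    return {"bucket": value // BUCKET_WIDTH, "ciphertext": token}

def range_query(rows, low: int, high: int):
    """Pre-filter by bucket over ciphertext, then decrypt and refine exactly."""
    lo_b, hi_b = low // BUCKET_WIDTH, high // BUCKET_WIDTH
    candidates = [r for r in rows if lo_b <= r["bucket"] <= hi_b]  # no decryption yet
    results = []
    for r in candidates:                                           # refine after decryption
        value = int(CIPHER.decrypt(r["ciphertext"]).decode())
        if low <= value <= high:
            results.append(value)
    return results

salaries = [encrypt_value(v) for v in (1200, 45800, 46750, 99000)]
print(range_query(salaries, 45000, 50000))  # -> [45800, 46750]
```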
Abstract:
A substantial body of literature exists identifying factors contributing to under-performing Enterprise Resource Planning systems (ERPs), including poor communication, lack of executive support and user dissatisfaction (Calisir et al., 2009). Of particular interest is Momoh et al.’s (2010) recent review identifying poor data quality (DQ) as one of nine critical factors associated with ERP failure. DQ is central to ERP operating processes, ERP-facilitated decision-making and inter-organizational cooperation (Batini et al., 2009). Crucial in ERP contexts is that the integrated, automated, process-driven nature of ERP data flows can amplify DQ issues, compounding minor errors as they flow through the system (Haug et al., 2009; Xu et al., 2002). However, despite the growing appreciation of the importance of DQ in determining ERP success, little research addresses the relationship between stakeholders’ requirements and perceptions of ERP DQ, perceived data utility, and the impact of users’ treatment of data on ERP outcomes.
Abstract:
Advances in information and communication technologies have brought about an information revolution, leading to fundamental changes in the way that information is collected or generated, shared and distributed. The importance of establishing systems in which research findings can be readily made available to and used by other researchers has long been recognized in international scientific collaborations. If the data access principles adopted by international scientific collaborations are to be effectively implemented, they must be supported by the national policies and laws in place in the countries in which participating researchers operate.
Abstract:
Server consolidation using virtualization technology has become an important technique for improving the energy efficiency of data centers, and virtual machine placement is the key problem in server consolidation. In the past few years, many approaches to virtual machine placement have been proposed. However, existing approaches consider only the energy consumed by the physical machines in a data center and ignore the energy consumed by the data center's communication network. This network energy consumption is not trivial and should therefore be taken into account in virtual machine placement in order to make the data center more energy-efficient. In this paper, we propose a genetic algorithm for a new virtual machine placement problem that considers the energy consumption of both the servers and the communication network in the data center. Experimental results show that the genetic algorithm performs well when tackling test problems of different kinds, and scales well as the problem size increases.
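To make the idea concrete, here is a minimal sketch of a genetic algorithm for the joint objective, written under assumptions of my own (a linear idle-plus-utilisation server power model, a per-unit cost for traffic between VMs placed on different machines, one-point crossover with elitism); it is not the algorithm or energy model used in the paper.

```python
import random

# Illustrative problem data (not from the paper): VM CPU demands, identical PM
# capacity, and a traffic matrix between VM pairs used to model network energy.
random.seed(1)
NUM_VMS, NUM_PMS, PM_CAPACITY = 12, 6, 100
VM_LOAD = [random.randint(10, 40) for _ in range(NUM_VMS)]
TRAFFIC = {(i, j): random.randint(0, 5)
           for i in range(NUM_VMS) for j in range(i + 1, NUM_VMS)}
IDLE_POWER, PEAK_POWER, NET_COST = 150.0, 300.0, 2.0  # assumed energy coefficients

def energy(placement):
    """Server energy (idle + load-proportional) plus inter-PM traffic energy."""
    load = [0] * NUM_PMS
    for vm, pm in enumerate(placement):
        load[pm] += VM_LOAD[vm]
    if any(l > PM_CAPACITY for l in load):
        return float("inf")                      # infeasible: over-committed PM
    server = sum(IDLE_POWER + (PEAK_POWER - IDLE_POWER) * l / PM_CAPACITY
                 for l in load if l > 0)
    network = sum(NET_COST * t for (i, j), t in TRAFFIC.items()
                  if placement[i] != placement[j])
    return server + network

def evolve(pop_size=60, generations=200, mutation_rate=0.1):
    """Evolve VM-to-PM assignments towards lower combined energy."""
    pop = [[random.randrange(NUM_PMS) for _ in range(NUM_VMS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy)
        next_gen = pop[:10]                      # elitism: keep the best solutions
        while len(next_gen) < pop_size:
            a, b = random.sample(pop[:30], 2)    # parents drawn from the fitter half
            cut = random.randrange(1, NUM_VMS)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:  # mutation: move one VM to another PM
                child[random.randrange(NUM_VMS)] = random.randrange(NUM_PMS)
            next_gen.append(child)
        pop = next_gen
    best = min(pop, key=energy)
    return best, energy(best)

placement, cost = evolve()
print("best placement:", placement, "energy:", round(cost, 1))
```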
Abstract:
Recent management research has evidenced the significance of organizational social networks, and communication is believed to shape interpersonal relationships. However, we have little knowledge of how communication affects organizational social networks. This paper studies the dynamics between organizational communication patterns and the growth of organizational social networks. We propose an organizational social network growth model and then collect empirical data to test model validity. The simulation results agree well with the empirical data. The results of the simulation experiments enrich our knowledge of communication with the finding that organizational management practices which discourage employees from communicating within and across group boundaries have disparate and significant negative effects on the social network’s density, scalar assortativity and discrete assortativity, each of which correlates with the organization’s performance. These findings also suggest concrete measures by which management can construct and develop the organizational social network.
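For reference, the three network measures named above can be computed directly with the networkx package, as in the toy example below; the ten-person graph, its group labels and its seniority values are invented purely to illustrate the calls.

```python
import networkx as nx

# Toy organisational network (illustrative only): nodes are employees tagged with
# a work group (categorical) and seniority in years (numeric); edges are
# observed communication ties.
G = nx.Graph()
for i in range(10):
    G.add_node(i, group="A" if i < 5 else "B", seniority=(i % 4) + 1)
G.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 4), (5, 6), (6, 7), (7, 8),
                  (8, 9), (4, 5), (3, 7)])  # the last two ties cross group boundaries

# Density: the share of possible ties that actually exist
print("density:", round(nx.density(G), 3))

# Scalar assortativity: correlation of a numeric attribute across ties
print("seniority assortativity:",
      round(nx.numeric_assortativity_coefficient(G, "seniority"), 3))

# Discrete assortativity: tendency of ties to stay within a categorical group
print("group assortativity:",
      round(nx.attribute_assortativity_coefficient(G, "group"), 3))
```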
Abstract:
This article addresses the paucity of systematic data on graduate careers in the arts and humanities in the broader context of enduring public and policy debates about the benefits of education to society, the relation between public and private good that is derivable from education, and the specific disciplinary angle that can be brought to bear on these questions from media, cultural and communication studies. We report findings from a survey of ten years of graduates from Queensland University of Technology’s courses in media, cultural and communication studies which indicate very high employment levels and generally positive accounts of the relevance of courses to working life. A major insight that can be drawn from the research is that media, cultural and communication studies deliver capabilities, skills and orientations which are themselves strongly aligned with the kinds of transferable generic attributes which facilitate transition into the workplace.
Abstract:
This article investigates the role of information communication technologies (ICTs) in establishing a well-aligned, authentic learning environment for a diverse cohort of non-cognate and cognate students studying event management in a higher education context. Based on a case study which examined the way ICTs assisted in accommodating diverse learning needs, styles and stages in an event management subject offered in the Creative Industries Faculty at Queensland University of Technology in Brisbane, Australia, the article uses an action research approach to generate grounded, empirical data on the effectiveness of the dynamic, individualised curriculum frameworks that the use of ICTs makes possible. The study provides insights into the way non-cognate and cognate students respond to different learning tools. It finds that whilst non-cognate and cognate students do respond to learning tools differently, due to a differing degree of emphasis on technical, task or theoretical competencies, the use of ICTs allows all students to improve their performance by providing multiple points of entry into the content. In this respect, whilst the article focuses on the way ICTs can be used to develop an authentic, well-aligned curriculum model that meets the needs of event management students in a higher education context, with findings relevant for event educators in Business, Hospitality, Tourism and Creative Industries, the strategies outlined may also be useful for educators in other fields who are faced with similar challenges when designing and developing curriculum for diverse cohorts.
Abstract:
A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic, readily available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the possibility of the units performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate, wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application for this functionality is the creation of a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements. Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is then verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse System comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for synchronising the clocks between the boards wirelessly makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results.
Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement: the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of the Slaves from the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below its target of 1 ms.
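For context, the core of a PTP-style synchronisation exchange is the offset and delay computation from four timestamps. The sketch below shows the standard two-way-exchange formula under the usual symmetric-path assumption; it is a textbook illustration rather than the BabelFuse firmware's actual implementation, and the timestamps are invented.

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Clock offset and one-way path delay from a PTP-style two-way exchange.

    t1: master timestamp when the Sync message is sent
    t2: slave timestamp when the Sync message is received
    t3: slave timestamp when the Delay_Req message is sent
    t4: master timestamp when the Delay_Req message is received

    Assumes a symmetric path; all times in seconds.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2.0  # how far the slave clock is ahead of the master
    delay = ((t2 - t1) + (t4 - t3)) / 2.0   # mean one-way propagation delay
    return offset, delay

# Invented timestamps: the slave clock is 2.5 us ahead and the path delay is 1.5 us
offset, delay = ptp_offset_and_delay(t1=10.000000, t2=10.000004, t3=10.000010, t4=10.000009)
print(f"offset = {offset * 1e6:.1f} us, delay = {delay * 1e6:.1f} us")
```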