597 results for Information Interaction
Abstract:
Experimental / pilot online journalistic publication. EUAustralia Online (www.euaustralia.com) is a pilot niche publication identifying and demonstrating dynamics of online journalism. The editor, an experienced senior journalist and academic specialising in European studies, commenced publication on 28.8.06 during one year's "industry immersion", with media accreditation to the European Commission, Brussels. Reporting now takes place from Australia and from Europe on field-trip exercises. Student editors participate, making it partly a training operation. EUAustralia demonstrates the adaptation of conventional, universal, "Western" liberal journalistic practices. Its first premise is to fill a knowledge gap in Australia about the European Union: its institutions, functions and directions. The second premise is to test the communications capacity of the online format, where the publication sets a strong standard of journalistic credibility, hence its transparency with sourcing and its signposting of "commentary" or "opinion". EUAustralia uses modified, enhanced weblog software allowing for the future allocation of closed pages to subscribers. An early exemplar of its kind, with a modest upload rate (2010-13 average, 16 postings monthly), it is well regarded and commands over 180,000 site visits p.a. (half as unique visitors; AWB Statistics), and is strongly rated by search engines: see the page-one Google placements for "EU Australia". Comment by the ISP (SeventhVision, Broadbeach, Queensland): "The site has good search engine recognition because it is seen as credible; it can be used to generate revenue". This journalistic exercise has been analysed in theoretical context twice, in published refereed conference proceedings (Communication and Media Policy Forum, Sydney; 2007, 2009).
Abstract:
Researching administrative history is problematic. A trail of authoritative documents is often hard to find, and useful summaries can be difficult to organise, especially if source material is held in paper formats in geographically dispersed locations. In the absence of documents, the reasons for particular decisions and the rationale underpinning particular policies can be confounded as key personnel advance in their professions and retire. The rationale for past decisions may be lost for practical purposes; and if an organisation's memory of events is diminished, its learning through experience is also diminished. This document is published to help avoid unnecessary duplication of effort by other researchers who need to examine how policies of charging for public sector information have been justified. The author compiled this work within a limited time period, and it does not pretend to be a complete or comprehensive analysis of the issues.----- A significant part of the role of government is to provide a framework of legally enforceable rights and obligations that can support individuals and non-government organisations in their lawful activities. Accordingly, claims that governments should be more 'business-like' need careful scrutiny. A significant supply of goods and services occurs as non-market activity, where neither benefits nor costs are quantified within conventional accounting systems or in terms of money. Where a government decides to provide information as a service (information from land registries is archetypal), the transactions occur as a political decision made under a direct or clearly delegated authority of a parliament with the requisite constitutional powers. This is not a market transaction, and the language of the market confuses attempts to describe a number of aspects of how governments allocate resources.----- Cost recovery can be construed as an aspect of taxation that is a sole prerogative of a parliament.
The issues are fundamental to political constitutions, but they become more complicated where states cede some taxing powers to a central government as part of a federal system. Nor should the absence of markets necessarily be construed as 'market failure' or even 'government failure'; the absence is often attributable to particular technical, economic and political constraints that preclude the operation of markets. Arguably, greater care is needed in distinguishing between the polity and markets in raising revenues and allocating resources, and that needs to start by removing unhelpful references to 'business' in the context of government decision-making.
Abstract:
This study explores strategic decision-making (SDM) in micro-firms, an economically significant business subsector. As extant large- and small-firm literature currently proffers an incomplete characterization of SDM in very small enterprises, a multiple-case methodology was used to investigate how these firms make strategic decisions. Eleven Australian Information Technology service micro-firms participated in the study. Using an information-processing lens, the study uncovered patterns of SDM in micro-firms and derived a theoretical micro-firm SDM model. This research also identifies several implications for micro-firm management and directions for future research, contributing to the understanding of micro-firm SDM in both theory and practice.
Abstract:
The impact of technology on the health and well-being of workers has been a topic of interest since computers and computerized technology were widely introduced in the 1980s. Of recent concern is the impact of rapid technological advances on individuals’ psychological well-being, especially due to advancements in mobile technology which have increased many workers’ accessibility and expected productivity. In this chapter we focus on the associations between occupational stress and technology, especially behavioral and psychological reactions. We discuss some key facilitators and barriers associated with users’ acceptance of and engagement with information and communication technology. We conclude with recommendations for ongoing research on managing occupational health and well-being in conjunction with technological advancements.
What are students' understandings of how digital tools contribute to learning in design disciplines?
Abstract:
Building Information Modelling (BIM) is evolving in the construction industry as a successor to CAD. CAD is mostly a technical tool that conforms to existing industry practices; BIM, however, has the capacity to revolutionise industry practice. Rather than producing representations of design intent, BIM produces an exact virtual prototype of a building that, in an ideal situation, is centrally stored and freely exchanged between the project team, facilitating collaboration and allowing experimentation in design. Exposing design students to this technology through their formal studies allows them to engage with cutting-edge industry practices and to help shape the industry upon their graduation. Since this technology is relatively new to the construction industry, there are no accepted models for how to "teach" BIM effectively at university level. Developing learning models that enable students to make the most of their learning with BIM presents significant challenges to those teaching in the field of design. To date there are also no studies of students' experiences of using this technology. This research reports on the introduction of BIM software into a second-year Bachelor of Design course. This software has the potential to change industry standards through its ability to revolutionise the work practices of those involved in large-scale design projects. Students' understandings and experiences of using the software to complete design projects as part of their assessment are reported here. In-depth semi-structured interviews with six students revealed views of the software ranging from novice to sophisticated. Students varied in their understanding of how the software could be used to complete course requirements, to assist with the design process, and in the workplace. They had engaged in limited exploration of the collaborative potential of the software as a design tool.
Their understanding of the significance of BIM for the workplace was also variable. The results indicate that students are beginning to develop an appreciation for how BIM could aid or constrain the work of designers, but that this appreciation is highly varied and likely to be dependent on the students’ previous experiences of working in a design studio environment. Their range of understandings of the significance of the technology is a reflection of their level of development as designers (they are “novice” designers). The results also indicate that there is a need for subjects in later years of the course that allow students to specialise in the area of digital design and to develop more sophisticated views of the role of technology in the design process. There is also a need to capitalise on the collaborative potential inherent in the software in order to realise its capability to streamline some aspects of the design process. As students become more sophisticated designers we should explore their understanding of the role of technology as a design tool in more depth in order to make recommendations for improvements to teaching and learning practice related to BIM and other digital design tools.
Abstract:
This paper explores how mobile games can transform everyday places into dynamic learning spaces filled with information and inspiration. It discusses the motivation inherent in playing games and creating games for others, and how this stimulates an iterative process of creation and reflection and evokes a natural desire to engage in learning. The use of MiLK at the Adelaide Botanic Gardens is offered as a case in point. MiLK is an authoring tool that allows students and teachers to create and share SMS games for mobile phones. A group of South Australian high school students used MiLK to play a game, create their own games and play each other’s games during a day at the gardens. This paper details the learning processes involved in these activities and how the students reflected on their learning, conducted peer assessment, and engaged in a two-way discussion with their teacher about new technologies and their implications for learning. The paper concludes with a discussion of the needs and requirements of 21st Century learners and how MiLK can support constructivist and connectivist teaching methods that engage learners and may produce an appropriately skilled future workforce.
Abstract:
Monitoring Internet traffic is critical in order to acquire a good understanding of threats to computer and network security and in designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content. These techniques include monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is considered potentially malicious or suspicious by definition. This unique characteristic of the honeypot reduces the amount of collected traffic and makes it a more valuable source of information than other existing techniques. Currently, there is insufficient research in the honeypot data analysis field. To date, most of the work on honeypots has been devoted to the design of new honeypots or optimizing the current ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are presently immature, while analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for developing more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early signs of new vulnerabilities or breakouts of new automated malicious codes, such as worms. 
The outcomes of this research include:
• Identification of repeated use of attack tools and attack processes, by grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• Application of principal component analysis to detect the structure of attackers' activities present in low-interaction honeypots and to visualize attackers' behaviors;
• Detection of new attacks in low-interaction honeypot traffic through the use of the principal component's residual space and the square prediction error statistic;
• Real-time detection of new attacks using recursive principal component analysis;
• A proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
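The residual-space detection idea can be sketched briefly. In the hedged example below, the feature set, threshold rule and data are illustrative assumptions rather than the thesis's actual pipeline: honeypot activities are summarised as numeric vectors, a PCA model is fitted to known traffic, and any new activity whose squared prediction error (SPE) lies far off the model plane is flagged as potentially new attack behaviour.

```python
import numpy as np

# Hedged sketch: SPE (Q-statistic) detection in the PCA residual space.
# Rows of `baseline` are hypothetical per-activity summaries (e.g. packet
# counts, mean inter-arrival time), not real honeypot data.
rng = np.random.default_rng(0)
baseline = rng.normal(size=(200, 5))           # "known" honeypot activity
mu = baseline.mean(axis=0)
X = baseline - mu

# Keep the first k principal components; the rest span the residual space.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
P = Vt[:k].T                                   # loading matrix (5 x k)

def spe(x):
    """Squared prediction error: squared distance from x to the PCA plane."""
    xc = x - mu
    residual = xc - P @ (P.T @ xc)
    return float(residual @ residual)

# Threshold set empirically from baseline SPE values (a simple stand-in for
# the chi-square approximations often used in practice).
threshold = np.quantile([spe(x) for x in baseline], 0.99)

new_attack = mu + 10 * np.ones(5)              # activity far off the model plane
print(spe(new_attack) > threshold)             # flagged as new behaviour
```

A recursive variant, as named in the outcomes, would update `mu` and `P` incrementally as traffic arrives instead of refitting from scratch.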
Abstract:
This thesis presents an investigation into theoretical models for the formation and interaction of nanoparticles. The work includes a literature review of current models followed by five chapters of original research. The thesis has been submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy by publication, and each of the five chapters therefore consists of a peer-reviewed journal article. The thesis concludes with a discussion of what has been achieved during the PhD candidature, the potential applications of this research, and ways in which the research could be extended in the future. In this thesis we explore stochastic models pertaining to the interaction and evolution mechanisms of nanoparticles. In particular, we explore in depth the stochastic evaporation of molecules due to thermal activation and its ultimate effect on nanoparticle sizes and concentrations. Secondly, we analyse the thermal vibrations of nanoparticles suspended in a fluid and subject to standing oscillating drag forces (as would occur in a standing sound wave), and finally on lattice surfaces in the presence of high heat gradients. We describe a number of new models for multicompartment networks joined by multiple, stochastically evaporating links. The primary motivation for this work is the description of thermal fragmentation, in which multiple molecules holding parts of a carbonaceous nanoparticle may evaporate. Ultimately, these models predict the rate at which the network or aggregate fragments into smaller networks or aggregates, and with what aggregate size distribution. The models are highly analytic and describe the fragmentation of a link holding multiple bonds using Markov processes that best describe different physical situations; these processes have been analysed using a number of mathematical methods.
The fragmentation of the network or aggregate is then predicted using combinatorial arguments. While there is some scepticism in the scientific community about the proposed mechanism of thermal fragmentation, we present compelling evidence in this thesis supporting the currently proposed mechanism and show that our models can accurately match experimental results. This was achieved using a realistic simulation of the fragmentation of the fractal carbonaceous aggregate structure using our models. Furthermore, this thesis investigates a method of manipulation using acoustic standing waves. We analysed the effect of frequency and particle size on the ability of a particle to be manipulated by means of a standing acoustic wave. We report the existence of a critical frequency for a particular particle size; this frequency is inversely proportional to the Stokes time of the particle in the fluid. We also find that at large frequencies the subtle Brownian motion of even larger particles plays a significant role in the efficacy of the manipulation, owing to the decreasing size of the boundary layer between acoustic nodes. Our model uses a multiple-time-scale approach to calculate the long-term effects of the standing acoustic field on the particles interacting with the sound. These effects are then combined with the effects of Brownian motion to obtain a complete mathematical description of the particle dynamics in such acoustic fields. Finally, in this thesis, we develop a numerical routine for the description of "thermal tweezers". The technique of thermal tweezers is currently predominantly theoretical, although a handful of successful experiments have demonstrated the effect in practice. Thermal tweezers is the name given to the way in which particles can be easily manipulated on a lattice surface by careful selection of a heat distribution over the surface.
Typically, theoretical simulations of the effect are rather time-consuming, with supercomputer facilities processing data over days or even weeks. Our alternative numerical method for simulating particle distributions in the thermal tweezers effect uses the Fokker-Planck equation to derive a quick numerical calculation of the effective diffusion constant resulting from the lattice and the temperature. We then use this diffusion constant and solve the diffusion equation numerically using the finite volume method. This saves the algorithm from calculating many individual particle trajectories, since it describes the flow of the probability distribution of particles in a continuous manner. The alternative method outlined in this thesis can produce a larger quantity of accurate results on a household PC in a matter of hours, which is much better than was previously achievable.
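The second step of that two-step approach can be sketched as a minimal example. Here the effective diffusion profile D(x) is an invented stand-in for the Fokker-Planck-derived, temperature-dependent value, and the grid and time step are illustrative; the sketch shows only how a 1D finite-volume scheme evolves the particle probability density while conserving total probability.

```python
import numpy as np

# Hedged sketch: evolve a probability density p(x, t) under a spatially
# varying effective diffusion constant D(x), using an explicit 1D
# finite-volume scheme with zero-flux boundaries.
N, L = 100, 1.0
dx = L / N
x = (np.arange(N) + 0.5) * dx                  # cell centres
D = 0.1 + 0.05 * np.cos(2 * np.pi * x)         # hypothetical D(x) from lattice + heat map

p = np.zeros(N)
p[N // 2] = 1.0 / dx                           # all probability in one cell
dt = 0.4 * dx**2 / D.max()                     # within the explicit stability limit

for _ in range(500):
    # Diffusive fluxes at interior faces: F = -D dp/dx, D averaged at faces.
    Dface = 0.5 * (D[:-1] + D[1:])
    F = -Dface * (p[1:] - p[:-1]) / dx
    # Update each cell by the net flux through its faces; the zero-flux
    # boundaries make the update exactly conservative.
    p[1:-1] -= dt / dx * (F[1:] - F[:-1])
    p[0] -= dt / dx * F[0]
    p[-1] += dt / dx * F[-1]

print(abs(p.sum() * dx - 1.0) < 1e-6)          # total probability is conserved
```

Because the scheme tracks the density directly rather than sampling trajectories, one run replaces the averaging over many individual particle paths.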
Abstract:
Information Retrieval (IR) is an important albeit imperfect component of information technologies. Insufficient diversity of retrieved documents is one of the primary problems studied in this research. This study shows that the problem leads to a decrease in precision and recall, the traditional measures of information retrieval effectiveness. The thesis presents an adaptive IR system based on the theory of adaptive dual control. The aim of the approach is to optimize retrieval precision after all feedback has been issued, which is done by increasing the diversity of retrieved documents. This study shows that the value of recall reflects this diversity. The Probability Ranking Principle is viewed in the literature as the "bedrock" of current probabilistic Information Retrieval theory. Neither the proposed approach nor other methods in the literature for diversifying retrieved documents conform to this principle. This study shows by counterexample that the Probability Ranking Principle does not in general lead to optimal precision in a search session with feedback (for which it may not have been designed but is actively used). To accomplish the aim, retrieval precision of the search session should be optimized with a multistage stochastic programming model. However, such models are computationally intractable. Therefore, approximate linear multistage stochastic programming models are derived in this study, where the multistage improvement of the probability distribution is modelled using the proposed feedback correctness method. The proposed optimization models are based on several assumptions, starting with the assumption that Information Retrieval is conducted in units of topics. The use of clusters is the primary reason why a new method of probability estimation is proposed. The adaptive dual control of the topic-based IR system was evaluated in a series of experiments conducted on the Reuters, Wikipedia and TREC collections of documents.
The Wikipedia experiment revealed that the dual control feedback mechanism improves precision and S-recall when all of the underlying assumptions are satisfied. In the TREC experiment, this feedback mechanism was compared to a state-of-the-art adaptive IR system based on BM-25 term weighting and the Rocchio relevance feedback algorithm. The baseline system exhibited better effectiveness than the cluster-based optimization model of ADTIR; the main reason was the insufficient quality of the clusters generated from the TREC collection, which violated the underlying assumption.
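For readers unfamiliar with the baseline's term weighting, a compact sketch of BM-25 scoring follows. The toy corpus and the parameter values (k1, b) are illustrative assumptions, not those of the TREC experiment; the Rocchio feedback step is omitted.

```python
import math
from collections import Counter

# Hedged sketch of BM25 term weighting over a tiny invented corpus.
docs = [
    "information retrieval with relevance feedback".split(),
    "adaptive control of retrieval precision".split(),
    "cluster based topic retrieval".split(),
]
N = len(docs)
avgdl = sum(len(d) for d in docs) / N
df = Counter(t for d in docs for t in set(d))   # document frequency per term

def bm25(query, doc, k1=1.2, b=0.75):
    """Score one document against a query with standard BM25 weighting."""
    tf = Counter(doc)
    score = 0.0
    for term in query:
        if term not in tf:
            continue
        idf = math.log(1 + (N - df[term] + 0.5) / (df[term] + 0.5))
        norm = tf[term] * (k1 + 1) / (tf[term] + k1 * (1 - b + b * len(doc) / avgdl))
        score += idf * norm
    return score

query = "retrieval feedback".split()
ranked = sorted(range(N), key=lambda i: bm25(query, docs[i]), reverse=True)
print(ranked[0])   # the document matching both query terms ranks first
```

In a session with feedback, a Rocchio step would then move the query vector toward documents judged relevant before re-ranking.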
Abstract:
The dynamic interaction between building systems and the external climate is extremely complex, involving a large number of difficult-to-predict variables. In order to study the impact of climate change on the built environment, building simulation techniques must often be used together with forecast weather data. Since most building simulation programs require hourly meteorological input data for thermal comfort and energy evaluation, the provision of suitable weather data becomes critical. This paper reviews the methods used to prepare future weather data for studying the impact of climate change, discusses the advantages and disadvantages of each method, and illustrates the inherent relationships between them. Based on these discussions and an analysis of Australian historical climate data, an effective framework and procedure for generating future hourly weather data is presented. It is shown that this method not only can deal with different levels of available information regarding climate change, but also retains the key characteristics of a "typical" year's weather data for the desired period.
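One common family of methods for producing such future hourly files is "morphing": adjusting a typical-year series with monthly climate-model changes while preserving its hour-to-hour character. The sketch below is a hedged illustration of that general idea, not the paper's specific framework; the synthetic temperature series and the shift/stretch values are invented.

```python
import numpy as np

# Hedged sketch of morphing: future = typical + shift + stretch * anomaly.
hours = np.arange(24 * 31)                            # one month of hourly data
t_typical = 20 + 8 * np.sin(2 * np.pi * hours / 24)   # synthetic typical-year temps (deg C)
t_mean = t_typical.mean()

delta_mean = 1.5   # hypothetical projected change in the monthly mean (shift)
alpha = 0.1        # hypothetical fractional change in variability (stretch)

# Shift the mean and stretch the anomalies about it; the diurnal shape
# of the typical series is retained.
t_future = t_typical + delta_mean + alpha * (t_typical - t_mean)

print(round(t_future.mean() - t_mean, 6))             # mean shifted by delta_mean
```

The appeal of this approach is that it needs only monthly change factors from a climate model, yet keeps the hourly structure that building simulation programs require.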
Abstract:
One role of the nurse in the clinical setting is coordinating communication across the healthcare team. On a daily basis nurses interact with the person receiving care, their family members, and multiple care providers, placing the nurse in a central position with access to a vast array of information on the person. Through this, nurses have historically functioned as "information repositories". With the advent of Health Information Technology (HIT) tools, HIT has the potential to affect interdisciplinary communication, practice efficiency and effectiveness, relationships and workflow in acute care settings [1][3]. In 2005, the HIMSS Nursing Informatics Community developed the IHIT Scale to measure the impact of HIT on the nursing role and interdisciplinary communication in USA hospitals. In 2007, nursing informatics colleagues from Australia, Finland, Ireland, New Zealand, Scotland and the USA formed a research collaborative to validate the IHIT in six additional countries. This paper discusses the background, methodology, results and implications of the Australian IHIT survey of over 1100 nurses. The results are currently being analyzed and will be presented at the conference.
Abstract:
In 2005, the Healthcare Information and Management Systems Society (HIMSS) Nursing Informatics Community developed a survey, the IHIT Scale, to measure the impact of health information technology (HIT) on the role of nurses and on interdisciplinary communication in hospital settings. In 2007, nursing informatics colleagues from Australia, England, Finland, Ireland, New Zealand, Scotland and the United States formed a research collaborative to validate the IHIT across countries. All teams have completed construct and face validation in their countries. Five of the six teams have initiated reliability testing by practicing nurses. This paper reports the international collaborative's validation of the IHIT Scale completed to date.
Abstract:
We examine the nature and extent of statutory executive stock option (ESO) disclosures by Australian listed companies over the 2001 to 2004 period, and the influence of corporate governance mechanisms on these disclosures. Our results show a progressive increase in overall compliance from 2001 to 2004. However, despite the improved compliance, the results reveal managements’ continued reluctance to disclose more sensitive ESO information. Factors associated with good internal governance, including board independence, audit committee independence and effectiveness, and compensation committee independence and effectiveness are found to contribute to improved compliance. Similarly, certain external governance factors are associated with improved disclosure, including external auditor quality, shareholder activism (as proxied by companies identified as poor performers by the Australian Shareholders’ Association), and regulatory intervention.
Abstract:
Principal Topic: "In less than ten years music labels will not exist anymore." (Michael Smelli, former Global COO Sony/BMG; MCA/QUT IMP Business Lab Digital Music Think Tank, 9 May 2009, Brisbane.) Big music labels such as EMI, Sony BMG and UMG have been responsible for promoting and producing a myriad of stars in the music industry over recent decades. However, the industry structure is under enormous threat from the emergence of a new, innovative era of digital music. Recent years have seen a dramatic shift in industry power with the emergence of Napster and other file-sharing sites, iTunes and other online stores, and the iPod and MP3 revolution. Myspace.com and other social networking sites are connecting entrepreneurial artists with fans and creating online music communities independent of music labels. In 2008 the international digital music business grew by around 25% to US$3.7 billion. Digital platforms now account for around 20% of recorded music sales, up from 15% in 2007 (IFPI Digital Music Report 2009). CD sales have fallen by 40% since their peak levels. Global digital music sales totalled an estimated US$3 billion in 2007, an increase of 40% on 2006 figures. Digital sales account for an estimated 15% of the global market, up from 11% in 2006 and zero in 2003. The music industry is more advanced in terms of digital revenues than any other creative or entertainment industry (except games); its digital share is more than twice that of newspapers (7%), films (3%) or books (2%). All these shifts present new opportunities for music entrepreneurs to act entrepreneurially and promote their music independently of the major labels. Diffusion of innovations has a long tradition in both sociology (e.g. Rogers 1962, 2003) and marketing (Bass 1969; Mahajan et al., 1990). The context of the current project is theoretically interesting in two respects. First, online social networks replace traditional face-to-face word-of-mouth communications.
Second, music is a hedonistic product, which strongly influences the nature of interpersonal communications and their diffusion patterns. Both of these aspects have received very little attention in the diffusion literature to date, and no studies have investigated the influence of both simultaneously. This research project is concerned with the role of social networks in this new music industry landscape, and with how they may be leveraged by musicians willing to act entrepreneurially. The key research question we intend to address is: how do online social network communities affect the nature, pattern and speed with which music diffuses? Methodology/Key Propositions: We expect the nature of diffusion of popular, generic music genres to differ from that of specialized, niche music. To date, only Moe & Fader (2002) and Lee et al. (2003) have investigated diffusion patterns of music, and these studies forecast weekly sales of music CDs from advance purchase orders before launch rather than taking a detailed look at diffusion patterns. Consequently, our first research questions are concerned with understanding the nature of online communications within the context of the diffusion of music and artists: RQ1: What is the nature of fan-to-fan "word of mouth" online communications for music? Do these vary by type of artist and genre of music? RQ2: What is the nature of artist-to-fan online communications for music? Do these vary by type of artist and genre of music? What types of communication are effective? Two outcomes from social network theory are particularly relevant to understanding how music might diffuse through social networks. Weak tie theory (Granovetter, 1973) argues that casual or infrequent contacts within a social network (weak ties) act as a link to unique information that is not normally contained within an entrepreneur's inner-circle (strong tie) network.
A related argument, structural hole theory (Burt, 1992), posits that it is the absence of direct links (structural holes) between members of a social network that offers similar informational benefits. Although these two theories argue for the informational benefits of casual linkages and of diversity within a social network, others argue that a balanced network consisting of a mix of strong and weak ties is perhaps more important overall (Uzzi, 1996). It is anticipated that the network structure of the fan base for different types of artists and genres of music will vary considerably. This leads to our third research question: RQ3: How does the network structure of online social network communities affect the pattern and speed with which music diffuses? The current paper is best described as theory elaboration. It reports the first, exploratory phase, designed to develop and elaborate relevant theory (the second phase will be a quantitative study of network structure and diffusion). We intend to develop specific research propositions or hypotheses from the above research questions. To do so we will conduct three focus group discussions with independent musicians and three with fans active in online music communication on social network sites. We will also conduct five case studies of bands that have successfully built fan bases through social networking sites (e.g. myspace.com, facebook.com). The aim is to identify which communication channels they employ and the characteristics of fan interactions for different genres of music. We intend to interview each of the artists and analyse their online interaction with their fans. Results and Implications: At the current stage, we have just begun to conduct focus group discussions. An analysis of the themes from these focus groups will enable us to further refine our research questions into testable hypotheses.
Ultimately, our research will provide a better understanding of how social networks promote the diffusion of music, and how this varies for different genres of music. Hence, some music entrepreneurs will be able to promote their music more effectively. The results may be further generalised to other industries where online peer-to-peer communication is common, such as other forms of entertainment and consumer technologies.
Abstract:
Process modeling is a complex organizational task that requires many iterations and communication between the business analysts and the domain specialists involved in the process modeling. The challenge of process modeling is exacerbated, when the process of modeling has to be performed in a cross-organizational, distributed environment. Some systems have been developed to support collaborative process modeling, all of which use traditional 2D interfaces. We present an environment for collaborative process modeling, using 3D virtual environment technology. We make use of avatar instantiations of user ego centres, to allow for the spatial embodiment of the user with reference to the process model. We describe an innovative prototype collaborative process modeling approach, implemented as a modeling environment in Second Life. This approach leverages the use of virtual environments to provide user context for editing and collaborative exercises. We present a positive preliminary report on a case study, in which a test group modelled a business process using the system in Second Life.