938 results for Information Interaction


Relevance: 20.00%

Abstract:

Building Information Modelling (BIM) is evolving in the construction industry as a successor to CAD. CAD is mostly a technical tool that conforms to existing industry practices; BIM, however, has the capacity to revolutionise industry practice. Rather than producing representations of design intent, BIM produces an exact Virtual Prototype of any building that, in an ideal situation, is centrally stored and freely exchanged between the project team, facilitating collaboration and allowing experimentation in design. Exposing design students to this technology through their formal studies allows them to engage with cutting-edge industry practices and to help shape the industry upon their graduation. Since this technology is relatively new to the construction industry, there are no accepted models for how to “teach” BIM effectively at university level. Developing learning models to enable students to make the most of their learning with BIM presents significant challenges to those teaching in the field of design. To date there are also no studies of students’ experiences of using this technology. This research reports on the introduction of BIM software into a second-year Bachelor of Design course. This software has the potential to change industry standards through its ability to revolutionise the work practices of those involved in large-scale design projects. Students’ understandings and experiences of using the software to complete design projects as part of their assessment are reported here. In-depth semi-structured interviews with six students revealed that students held views of the software that ranged from novice to sophisticated. They varied in their understanding of how the software could be used to complete course requirements, to assist with the design process, and in the workplace. They had engaged in limited exploration of the collaborative potential of the software as a design tool. Their understanding of the significance of BIM for the workplace was also variable. The results indicate that students are beginning to develop an appreciation of how BIM could aid or constrain the work of designers, but that this appreciation is highly varied and likely to be dependent on the students’ previous experiences of working in a design studio environment. Their range of understandings of the significance of the technology reflects their level of development as designers (they are “novice” designers). The results also indicate a need for subjects in later years of the course that allow students to specialise in the area of digital design and to develop more sophisticated views of the role of technology in the design process. There is also a need to capitalise on the collaborative potential inherent in the software in order to realise its capability to streamline some aspects of the design process. As students become more sophisticated designers, we should explore their understanding of the role of technology as a design tool in more depth in order to make recommendations for improvements to teaching and learning practice related to BIM and other digital design tools.

Relevance: 20.00%

Abstract:

This paper explores how mobile games can transform everyday places into dynamic learning spaces filled with information and inspiration. It discusses the motivation inherent in playing games and creating games for others, and how this stimulates an iterative process of creation and reflection and evokes a natural desire to engage in learning. The use of MiLK at the Adelaide Botanic Gardens is offered as a case in point. MiLK is an authoring tool that allows students and teachers to create and share SMS games for mobile phones. A group of South Australian high school students used MiLK to play a game, create their own games and play each other’s games during a day at the gardens. This paper details the learning processes involved in these activities and how the students reflected on their learning, conducted peer assessment, and engaged in a two-way discussion with their teacher about new technologies and their implications for learning. The paper concludes with a discussion of the needs and requirements of 21st Century learners and how MiLK can support constructivist and connectivist teaching methods that engage learners and may produce an appropriately skilled future workforce.

Relevance: 20.00%

Abstract:

Monitoring Internet traffic is critical in order to acquire a good understanding of threats to computer and network security and in designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content. These techniques include monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is considered potentially malicious or suspicious by definition. This unique characteristic of the honeypot reduces the amount of collected traffic and makes it a more valuable source of information than other existing techniques. Currently, there is insufficient research in the honeypot data analysis field. To date, most of the work on honeypots has been devoted to the design of new honeypots or optimizing the current ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are presently immature, while analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for developing more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early signs of new vulnerabilities or breakouts of new automated malicious codes, such as worms. The outcomes of this research include:
• Identification of repeated use of attack tools and attack processes through grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• Application of principal component analysis to detect the structure of attackers’ activities present in low-interaction honeypots and to visualize attackers’ behaviors;
• Detection of new attacks in low-interaction honeypot traffic through the use of the principal component’s residual space and the square prediction error statistic;
• Real-time detection of new attacks using recursive principal component analysis;
• A proof of concept implementation for honeypot traffic analysis and real time monitoring.
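The detection outcomes above rest on a standard pattern: project traffic features onto the retained principal components and flag samples whose squared prediction error (SPE, the squared norm of the residual) exceeds a control limit. A minimal sketch of that pattern is given below; the feature choice (per-window packet counts) and the percentile-based control limit are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

def fit_pca_detector(X, n_components):
    """Fit a PCA model on 'normal' honeypot traffic features.

    X: (n_samples, n_features) matrix, e.g. per-time-window counts
    of packets per destination port (an illustrative feature choice).
    """
    mu = X.mean(axis=0)
    Xc = X - mu
    # Principal directions via SVD of the centred data.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T          # retained loadings (d x k)
    return mu, P

def spe(X, mu, P):
    """Squared prediction error (Q statistic): squared norm of the
    part of each sample lying in the residual subspace."""
    Xc = X - mu
    residual = Xc - Xc @ P @ P.T
    return np.sum(residual**2, axis=1)

# Usage sketch: flag windows whose SPE exceeds a control limit
# estimated from training data (a simple percentile is used here
# in place of the usual chi-square approximation).
rng = np.random.default_rng(0)
X_train = rng.poisson(5.0, size=(500, 20)).astype(float)
mu, P = fit_pca_detector(X_train, n_components=5)
limit = np.percentile(spe(X_train, mu, P), 99)

X_new = rng.poisson(5.0, size=(10, 20)).astype(float)
X_new[3] += 40.0                     # an injected anomalous window
alerts = spe(X_new, mu, P) > limit
print(alerts)
```

The recursive, real-time variant named in the outcomes would update the mean and loadings incrementally as new traffic windows arrive rather than refitting the model from scratch.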

Relevance: 20.00%

Abstract:

In this thesis an investigation into theoretical models for the formation and interaction of nanoparticles is presented. The work presented includes a literature review of current models followed by a series of five chapters of original research. This thesis has been submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy by publication, and therefore each of the five chapters consists of a peer-reviewed journal article. The thesis is then concluded with a discussion of what has been achieved during the PhD candidature, the potential applications for this research and ways in which the research could be extended in the future. In this thesis we explore stochastic models pertaining to the interaction and evolution mechanisms of nanoparticles. In particular, we explore in depth the stochastic evaporation of molecules due to thermal activation and its ultimate effect on nanoparticle sizes and concentrations. Secondly, we analyse the thermal vibrations of nanoparticles suspended in a fluid and subject to oscillating drag forces (as would occur in a standing sound wave), and finally on lattice surfaces in the presence of high heat gradients. We have described in this thesis a number of new models for the description of multicompartment networks joined by multiple, stochastically evaporating links. The primary motivation for this work is the description of thermal fragmentation, in which multiple molecules holding parts of a carbonaceous nanoparticle may evaporate. Ultimately, these models predict the rate at which the network or aggregate fragments into smaller networks/aggregates and with what aggregate size distribution. The models are highly analytic and describe the fragmentation of a link holding multiple bonds using Markov processes that best describe different physical situations, and these processes have been analysed using a number of mathematical methods. The fragmentation of the network/aggregate is then predicted using combinatorial arguments. Whilst there is some scepticism in the scientific community pertaining to the proposed mechanism of thermal fragmentation, we have presented compelling evidence in this thesis supporting the currently proposed mechanism and shown that our models can accurately match experimental results. This was achieved using a realistic simulation of the fragmentation of the fractal carbonaceous aggregate structure using our models. Furthermore, in this thesis a method of manipulation using acoustic standing waves is investigated. In our investigation we analysed the effect of frequency and particle size on the ability of the particle to be manipulated by means of a standing acoustic wave. In our results, we report the existence of a critical frequency for a particular particle size. This frequency is inversely proportional to the Stokes time of the particle in the fluid. We also find that for large frequencies the subtle Brownian motion of even larger particles plays a significant role in the efficacy of the manipulation. This is due to the decreasing size of the boundary layer between acoustic nodes. Our model utilises a multiple time scale approach to calculate the long-term effects of the standing acoustic field on the particles that are interacting with the sound. These effects are then combined with the effects of Brownian motion in order to obtain a complete mathematical description of the particle dynamics in such acoustic fields.
Finally, in this thesis, we develop a numerical routine for the description of "thermal tweezers". Currently, the technique of thermal tweezers is predominantly theoretical; however, there have been a handful of successful experiments which demonstrate the effect in practice. Thermal tweezers is the name given to the way in which particles can be easily manipulated on a lattice surface by careful selection of a heat distribution over the surface. Typically, theoretical simulations of the effect can be rather time-consuming, with supercomputer facilities processing data over days or even weeks. Our alternative numerical method for the simulation of particle distributions pertaining to the thermal tweezers effect uses the Fokker-Planck equation to derive a quick numerical method for the calculation of the effective diffusion constant resulting from the lattice and the temperature. We then use this diffusion constant and solve the diffusion equation numerically using the finite volume method. This saves the algorithm from calculating many individual particle trajectories, since it describes the flow of the probability distribution of particles in a continuous manner. The alternative method outlined in this thesis can produce a larger quantity of accurate results on a household PC in a matter of hours, which is a substantial improvement on what was previously achievable.
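The two-step routine described above (an effective diffusion constant derived via the Fokker-Planck equation, followed by a finite-volume solution of the diffusion equation) can be illustrated for the second step in one dimension. In this sketch the spatially varying coefficient D(x) is simply assumed rather than derived from a lattice potential and temperature field, and the Fickian form of the flux is likewise an assumption rather than a detail taken from the thesis.

```python
import numpy as np

def diffuse_fv(p0, D, dx, dt, n_steps):
    """Explicit finite-volume update of the diffusion equation
    dp/dt = d/dx( D(x) dp/dx ) on a 1D grid with zero-flux boundaries.

    p0 : initial probability density per cell (length N)
    D  : diffusion coefficient per cell (length N); in the thesis this
         would come from the lattice potential and the temperature field.
    """
    p = p0.copy()
    D_face = 0.5 * (D[:-1] + D[1:])               # D on interior cell faces
    for _ in range(n_steps):
        flux = -D_face * (p[1:] - p[:-1]) / dx
        flux = np.concatenate(([0.0], flux, [0.0]))  # reflecting boundaries
        p -= (dt / dx) * (flux[1:] - flux[:-1])
    return p

# Usage sketch: an initial peak of probability spreading over the grid
# (dt is chosen small enough for the explicit scheme to be stable).
N, dx, dt = 200, 0.05, 1e-4
x = np.arange(N) * dx
D = 0.5 + 0.4 * np.cos(2 * np.pi * x / (N * dx))  # assumed D(x) profile
p0 = np.zeros(N)
p0[N // 2] = 1.0 / dx                              # delta-like initial density
p = diffuse_fv(p0, D, dx, dt, n_steps=5000)
print(p.sum() * dx)                                # total probability stays ~1
```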

Relevance: 20.00%

Abstract:

Information Retrieval is an important albeit imperfect component of information technologies. A problem of insufficient diversity of retrieved documents is one of the primary issues studied in this research. This study shows that this problem leads to a decrease in precision and recall, the traditional measures of information retrieval effectiveness. This thesis presents an adaptive IR system based on the theory of adaptive dual control. The aim of the approach is the optimization of retrieval precision after all feedback has been issued. This is done by increasing the diversity of retrieved documents. This study shows that the value of recall reflects this diversity. The Probability Ranking Principle is viewed in the literature as the “bedrock” of current probabilistic Information Retrieval theory. Neither the proposed approach nor other methods of diversification of retrieved documents from the literature conform to this principle. This study shows by counterexample that the Probability Ranking Principle does not in general lead to optimal precision in a search session with feedback (for which it may not have been designed but is actively used). To accomplish this aim, the retrieval precision of the search session should be optimized with a multistage stochastic programming model. However, such models are computationally intractable. Therefore, approximate linear multistage stochastic programming models are derived in this study, where the multistage improvement of the probability distribution is modelled using the proposed feedback correctness method. The proposed optimization models are based on several assumptions, starting with the assumption that Information Retrieval is conducted in units of topics. The use of clusters is the primary reason why a new method of probability estimation is proposed. The adaptive dual control of the topic-based IR system was evaluated in a series of experiments conducted on the Reuters, Wikipedia and TREC collections of documents. The Wikipedia experiment revealed that the dual control feedback mechanism improves precision and S-recall when all the underlying assumptions are satisfied. In the TREC experiment, this feedback mechanism was compared to a state-of-the-art adaptive IR system based on BM-25 term weighting and the Rocchio relevance feedback algorithm. The baseline system exhibited better effectiveness than the cluster-based optimization model of ADTIR. The main reason for this was the insufficient quality of the generated clusters in the TREC collection, which violated the underlying assumption.
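The TREC baseline mentioned above combines BM25 term weighting with the Rocchio relevance feedback algorithm, both of which are standard and can be sketched briefly. This is a generic illustration of those two components, not the thesis's adaptive dual control model; the parameter values (k1, b and the Rocchio alpha/beta/gamma weights) are conventional defaults rather than values taken from the study.

```python
import math
from collections import Counter

def bm25_score(query_terms, doc_terms, df, n_docs, avg_len, k1=1.2, b=0.75):
    """BM25 score of one document (a list of terms) for a bag-of-words query."""
    tf = Counter(doc_terms)
    score = 0.0
    for t in query_terms:
        if t not in tf or t not in df:
            continue
        idf = math.log((n_docs - df[t] + 0.5) / (df[t] + 0.5) + 1.0)
        norm = tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(doc_terms) / avg_len))
        score += idf * norm
    return score

def rocchio(query_vec, relevant, non_relevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio update: move the query vector (term -> weight) toward the
    centroid of relevant documents and away from the non-relevant centroid."""
    terms = set(query_vec) | {t for d in relevant + non_relevant for t in d}
    new_q = {}
    for t in terms:
        rel = sum(d.get(t, 0.0) for d in relevant) / max(len(relevant), 1)
        non = sum(d.get(t, 0.0) for d in non_relevant) / max(len(non_relevant), 1)
        w = alpha * query_vec.get(t, 0.0) + beta * rel - gamma * non
        if w > 0:
            new_q[t] = w
    return new_q
```

In a feedback session the updated query vector from rocchio() would be used to re-rank the collection with bm25_score(), which is the loop the thesis's dual control approach is compared against.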

Relevance: 20.00%

Abstract:

The dynamic interaction between building systems and the external climate is extremely complex, involving a large number of difficult-to-predict variables. In order to study the impact of climate change on the built environment, the use of building simulation techniques together with forecast weather data is often necessary. Since most building simulation programs require hourly meteorological input data for their thermal comfort and energy evaluations, the provision of suitable weather data becomes critical. In this paper, the methods used to prepare future weather data for the study of the impact of climate change are reviewed. The advantages and disadvantages of each method are discussed, and the inherent relationship between these methods is also illustrated. Based on these discussions and an analysis of Australian historical climate data, an effective framework and procedure for generating future hourly weather data is presented. It is shown that this method is not only able to deal with different levels of available information regarding climate change, but can also retain the key characteristics of “typical” year weather data for a desired period.
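The paper itself reviews several ways of preparing future weather files; as a purely generic illustration of the basic idea behind such procedures, the sketch below applies assumed monthly warming offsets to a typical-year hourly temperature series so that the hour-to-hour character of the original data is retained. The function name, the single-variable scope and the offset values are all hypothetical and are not taken from the framework presented in the paper.

```python
import numpy as np

def shift_hourly_temps(hourly_temp, month_of_hour, monthly_delta):
    """Illustrative adjustment of typical-year hourly temperatures:
    add a projected mean warming for each month while keeping the
    hour-to-hour variation of the original 'typical' year.

    hourly_temp   : array of 8760 hourly dry-bulb temperatures (degC)
    month_of_hour : array of 8760 month indices (1-12), one per hour
    monthly_delta : dict mapping month -> projected warming (degC)
    """
    adjusted = hourly_temp.astype(float).copy()
    for month, delta in monthly_delta.items():
        adjusted[month_of_hour == month] += delta
    return adjusted

# Usage sketch with made-up data and offsets (a real framework would
# also adjust humidity, radiation and other weather variables).
rng = np.random.default_rng(1)
temps = 20 + 8 * rng.standard_normal(8760)
months = np.repeat(np.arange(1, 13), 730)      # crude 8760-hour calendar
deltas = {m: 1.5 for m in range(1, 13)}        # hypothetical +1.5 degC scenario
future = shift_hourly_temps(temps, months, deltas)
print(future.mean() - temps.mean())            # approximately 1.5
```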

Relevance: 20.00%

Abstract:

One of the roles of the nurse in the clinical setting is coordinating communication across the healthcare team. On a daily basis nurses interact with the person receiving care, their family members, and multiple care providers, placing the nurse in a central position with access to a vast array of information on the person. Through this, nurses have historically functioned as “information repositories”. With the advent of Health Information Technology (HIT) tools, there is a potential that HIT could impact interdisciplinary communication, practice efficiency and effectiveness, relationships and workflow in acute care settings [1][3]. In 2005, the HIMSS Nursing Informatics Community developed the IHIT Scale to measure the impact of HIT on the nursing role and interdisciplinary communication in USA hospitals. In 2007, nursing informatics colleagues from Australia, Finland, Ireland, New Zealand, Scotland and the USA formed a research collaborative to validate the IHIT in six additional countries. This paper will discuss the background, methodology, results and implications from the Australian IHIT survey of over 1100 nurses. The results are currently being analyzed and will be presented at the conference.

Relevance: 20.00%

Abstract:

In 2005, the Healthcare Information Management Systems Society (HIMSS) Nursing Informatics Community developed a survey to measure the impact of health information technology (HIT), the IHIT Scale, on the role of nurses and interdisciplinary communication in hospital settings. In 2007, nursing informatics colleagues from Australia, England, Finland, Ireland, New Zealand, Scotland and the United States formed a research collaborative to validate the IHIT across countries. All teams have completed construct and face validation in their countries. Five out of six teams have initiated reliability testing by practicing nurses. This paper reports the international collaborative’s validation of the IHIT Scale completed to date.

Relevance: 20.00%

Abstract:

We examine the nature and extent of statutory executive stock option (ESO) disclosures by Australian listed companies over the 2001 to 2004 period, and the influence of corporate governance mechanisms on these disclosures. Our results show a progressive increase in overall compliance from 2001 to 2004. However, despite the improved compliance, the results reveal management’s continued reluctance to disclose more sensitive ESO information. Factors associated with good internal governance, including board independence, audit committee independence and effectiveness, and compensation committee independence and effectiveness, are found to contribute to improved compliance. Similarly, certain external governance factors are associated with improved disclosure, including external auditor quality, shareholder activism (as proxied by companies identified as poor performers by the Australian Shareholders’ Association), and regulatory intervention.

Relevance: 20.00%

Abstract:

Principal Topic: "In less than ten years music labels will not exist anymore." — Michael Smelli, former Global COO Sony/BMG, MCA/QUT IMP Business Lab Digital Music Think Tank, 9 May 2009, Brisbane. Big music labels such as EMI, Sony BMG and UMG have been responsible for promoting and producing a myriad of stars in the music industry over recent decades. However, the industry structure is under enormous threat with the emergence of a new, innovative era of digital music. Recent years have seen a dramatic shift in industry power with the emergence of Napster and other file sharing sites, iTunes and other online stores, the iPod and the MP3 revolution. Myspace.com and other social networking sites are connecting entrepreneurial artists with fans and creating online music communities independent of music labels. In 2008 the digital music business internationally grew by around 25% to US$3.7 billion. Digital platforms now account for around 20% of recorded music sales, up from 15% in 2007 (IFPI Digital Music Report 2009). CD sales have fallen by 40% since their peak levels. Global digital music sales totalled an estimated US$3 billion in 2007, an increase of 40% on 2006 figures. Digital sales account for an estimated 15% of the global market, up from 11% in 2006 and zero in 2003. The music industry is more advanced in terms of digital revenues than any other creative or entertainment industry (except games). Its digital share is more than twice that of newspapers (7%), films (35) or books (2%). All these shifts present new possibilities for music entrepreneurs to promote their music independently of the major music labels. Diffusion of innovations has a long tradition in both sociology (e.g. Rogers 1962, 2003) and marketing (Bass 1969; Mahajan et al., 1990). The context of the current project is theoretically interesting in two respects. First, the role of online social networks replaces traditional face-to-face word-of-mouth communications. Second, as music is a hedonistic product, this strongly influences the nature of interpersonal communications and their diffusion patterns. Both of these have received very little attention in the diffusion literature to date, and no studies have investigated the influence of both simultaneously. This research project is concerned with the role of social networks in this new music industry landscape, and how it may be leveraged by musicians willing to act entrepreneurially. The key research question we intend to address is: How do online social network communities impact the nature, pattern and speed with which music diffuses? Methodology/Key Propositions: We expect the nature and character of diffusion of popular, generic music genres to differ from that of specialised, niche music. To date, only Moe & Fader (2002) and Lee et al. (2003) have investigated diffusion patterns of music, and these focus on forecasting weekly sales of music CDs from advance purchase orders before launch, rather than taking a detailed look at diffusion patterns. Consequently, our first research questions are concerned with understanding the nature of online communications within the context of the diffusion of music and artists. Hence, we have the following research questions: RQ1: What is the nature of fan-to-fan "word of mouth" online communications for music? Do these vary by type of artist and genre of music? RQ2: What is the nature of artist-to-fan online communications for music? Do these vary by type of artist and genre of music? What types of communication are effective?
Two outcomes from social network theory research are particularly relevant to understanding how music might diffuse through social networks. Weak tie theory (Granovetter, 1973) argues that casual or infrequent contacts within a social network (or weak ties) act as a link to unique information which is not normally contained within an entrepreneur's inner circle (or strong tie) social network. A related argument, structural hole theory (Burt, 1992), posits that it is the absence of direct links (or structural holes) between members of a social network which offers similar informational benefits. Although these two theories argue for the informational benefits of casual linkages and diversity within a social network, others acknowledge that a balanced network consisting of a mix of strong and weak ties is perhaps more important overall (Uzzi, 1996). It is anticipated that the network structure of the fan base for different types of artists and genres of music will vary considerably. This leads to our third research question: RQ3: How does the network structure of online social network communities impact the pattern and speed with which music diffuses? The current paper is best described as theory elaboration. It reports the first exploratory phase, designed to develop and elaborate relevant theory (the second phase will be a quantitative study of network structure and diffusion). We intend to develop specific research propositions or hypotheses from the above research questions. To do so we will conduct three focus group discussions with independent musicians and three focus group discussions with fans active in online music communication on social network sites. We will also conduct five case studies of bands that have successfully built fan bases through social networking sites (e.g. myspace.com, facebook.com). The idea is to identify which communication channels they employ and the characteristics of the fan interactions for different genres of music. We intend to conduct interviews with each of the artists and analyse their online interaction with their fans. Results and Implications: At the current stage, we have just begun to conduct focus group discussions. An analysis of the themes from these focus groups will enable us to further refine our research questions into testable hypotheses. Ultimately, our research will provide a better understanding of how social networks promote the diffusion of music, and how this varies for different genres of music. Hence, some music entrepreneurs will be able to promote their music more effectively. The results may be further generalised to other industries where online peer-to-peer communication is common, such as other forms of entertainment and consumer technologies.
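Because the abstract grounds its diffusion questions in the marketing diffusion tradition of Bass (1969), a minimal sketch of the classic Bass model is included below as a reference point. The coefficient values are purely illustrative; the study's own interest is in how online social network structure modifies such diffusion curves rather than in fitting this aggregate model.

```python
import numpy as np

def bass_adoption(p, q, m, n_periods):
    """Discrete-time Bass (1969) diffusion model.

    p : coefficient of innovation (external influence, e.g. media/promotion)
    q : coefficient of imitation (internal influence, word of mouth)
    m : market potential (total number of eventual adopters)
    Returns cumulative adopters at the end of each period.
    """
    cumulative = np.zeros(n_periods)
    N = 0.0
    for t in range(n_periods):
        new_adopters = (p + q * N / m) * (m - N)
        N += new_adopters
        cumulative[t] = N
    return cumulative

# Usage sketch: a niche genre driven mainly by word of mouth (low p, high q)
# versus a heavily promoted mainstream release (higher p, lower q).
niche = bass_adoption(p=0.001, q=0.5, m=10_000, n_periods=52)
mainstream = bass_adoption(p=0.03, q=0.3, m=10_000, n_periods=52)
print(int(niche[25]), int(mainstream[25]))
```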

Relevance: 20.00%

Abstract:

Process modeling is a complex organizational task that requires many iterations and communication between the business analysts and the domain specialists involved. The challenge of process modeling is exacerbated when the modeling has to be performed in a cross-organizational, distributed environment. Some systems have been developed to support collaborative process modeling, all of which use traditional 2D interfaces. We present an environment for collaborative process modeling using 3D virtual environment technology. We make use of avatar instantiations of user ego centres to allow for the spatial embodiment of the user with reference to the process model. We describe an innovative prototype collaborative process modeling approach, implemented as a modeling environment in Second Life. This approach leverages the use of virtual environments to provide user context for editing and collaborative exercises. We present a positive preliminary report on a case study in which a test group modelled a business process using the system in Second Life.

Relevance: 20.00%

Abstract:

This paper examines the role of intuition in the way that people operate unfamiliar devices. Intuition is a type of cognitive processing that is often non-conscious and utilises stored experiential knowledge. Intuitive interaction involves the use of knowledge gained from other products and/or experiences. Two initial experimental studies revealed that prior exposure to products employing similar features helped participants to complete set tasks more quickly and intuitively, and that familiar features were intuitively used more often than unfamiliar ones. A third experiment confirmed that performance is affected by a person's level of familiarity with similar technologies, and also revealed that appearance (shape, size and labelling of features) seems to be the variable that most affects time spent on a task and intuitive uses during that time. Age also seems to have an effect. These results and their implications are discussed.

Relevance: 20.00%

Abstract:

The field was the design of cross-cultural media art exhibition outcomes for the Japanese marketplace. The context was improved understanding of spatial, temporal and contextual exhibition design procedures as they ultimately impact upon the augmentation of cross-cultural understanding. The research investigated cross-cultural new media exhibition practices suited to the specific sensitivities of Japanese exhibition practices. The methodology was principally practice-led. The research drew upon seven years of prior exhibition design practice in order to generate new Japanese exhibition design methodologies. It also empowered Zaim Artspace’s Japanese curators to later present a range of substantial new media shows. The project also succeeded in developing new cross-cultural alliances that led to significant IDA projects in Beijing, Australia and Europe in the years 2008-10. Through invitations from external curators, new versions of the exhibition work subsequently travelled to four other major venues, including the prestigious Songzhang Art Museum, Beijing, in 07/08, the Block, QUT, Brisbane, and the Tokyo International Film Festival. Inspiration Art Press printed a major catalogue for the event, extensively featuring this exhibition. This project also led to the Sudamalis (2007) paper, ‘Building Capacity: Literacy And Creative Workforce Development Through International Digital Arts Projects (IDAprojects) Exhibition Programs And Partnerships’.

Relevance: 20.00%

Abstract:

Market failures involving the sale of complex merchandise, such as residential property, financial products and credit, have principally been attributed to information asymmetries. Existing legislative and regulatory responses were developed having regard to consumer protection policies based on traditional economic theories that focus on the notion of the ‘rational consumer’. Governmental responses therefore seek to impose disclosure obligations on sellers of complex goods or products to ensure that consumers have sufficient information upon which to make a decision. Emergent research, based on behavioural economics, challenges traditional ideas and instead focuses on the actual behaviour of consumers. This approach suggests that consumers as a whole do not necessarily benefit from mandatory disclosure because some, if not most, consumers do not pay attention to the disclosed information before they make a decision to purchase. The need for consumer policies to take consumer characteristics and behaviour into account is being increasingly recognised by governments, most recently in the policy framework suggested by the Australian Productivity Commission.

Relevance: 20.00%

Abstract:

The challenge for all educators is to fuse the learning of information literacy to an academic education in such a way that the outcome is systematic and sustainable learning for students. This challenge can be answered through long-term commitment to information literacy education bound to organisation-wide, renewable strategic planning and driven through systemic reform. This chapter seeks to explore the two sides of reforming information literacy education in an academic environment. Specifically, it will examine how one Australian university has undertaken the implementation of a rigorous strategic, systemic approach to information literacy learning and teaching.