325 results for computer supported collaborative work


Relevance:

30.00%

Publisher:

Abstract:

This paper presents a retrospective view of a game design practice that recently switched from the development of complex learning games to the development of simple authoring tools with which students design their own learning games for each other. We describe how our '10% Rule', the premise that only 10% of what is learnt during a game design process is ultimately appreciated by the player, became a major contributor to the evolving practice. We use this rule primarily as an analytical and illustrative tool to discuss the learning involved in designing and playing learning games, rather than as a scientifically and empirically proven rule. The 10% Rule grew out of our experience as designers and allows us to explore the often overlooked and valuable learning processes involved in designing learning games, and mobile games in particular. This discussion highlights that in designing mobile learning games, students are not only reflecting on their own learning processes by setting up structures for others to enquire and investigate; they are also engaging in high levels of independent inquiry and critical analysis in authentic learning settings. We conclude the paper with a discussion of the importance of these types of learning processes and skills of enquiry in 21st-century learning.

Relevance:

30.00%

Publisher:

Abstract:

In 2005, the Healthcare Information Management Systems Society (HIMSS) Nursing Informatics Community developed the IHIT Scale, a survey to measure the impact of health information technology (HIT) on the role of nurses and on interdisciplinary communication in hospital settings. In 2007, nursing informatics colleagues from Australia, England, Finland, Ireland, New Zealand, Scotland and the United States formed a research collaborative to validate the IHIT across countries. All teams have completed construct and face validation in their countries. Five out of six teams have initiated reliability testing with practicing nurses. This paper reports the international collaborative's validation of the IHIT Scale completed to date.

Relevance:

30.00%

Publisher:

Abstract:

The Studio GameOn event was supported by the Institute of the Creative Industries and Innovation and the Faculty of IT as part of the State Library of Queensland GAME ON exhibition (ex Barbican, UK). The studio produced a full game in six weeks. It was a curated event, a live web-based exhibition and a performance for the public, and the team produced a digital/creative work which is available for download. The studio enabled a team of students to experience the pressures of a real game studio within the space of the precincts, but also very much in the public eye. It was a physical test of the University's "for the real world" mantra: Studio GameOn is an opportunity running alongside the GAME ON exhibition at the State Library of Queensland. The exhibition itself is open to the public from November 17th through to February 15th. The studio runs from January 5th to February 13th, 2009. The Studio GameOn challenge? To put together a team of game developers and make a playable game in six weeks! The Studio GameOn team consists of a group of game developers in training: the team members are all students who are either half-way through or completing a qualification in game design and all its elements. We have designers, artists, programmers and production team members. We are also fortunate to have an Industry Board consisting of local Queensland games professionals: John Passfield (Red Sprite Studios), Adrian Cook (Wildfire Studios), and Duncan Curtis and Marko Grgic (The 3 Blokes). We also invite the public to play with us: there is an ideas box on-site at the State Library and a number of ways to communicate with us on this studio website.

Relevance:

30.00%

Publisher:

Abstract:

The 48 Hour Game Making Challenge has been running since its inception at the NEXT LEVEL Festival in 2004. It is curated by Truna aka j. Turner and Lubi Thomas and sees teams of both future game makers and industry professionals going head to head under pressure to produce playable games within the time period. The 48 Hour is supported by the International Game Developers Association (Brisbane Chapter) and the Creative Industries Precincts as part of their public programs. It is a curated event which engages industry with Brisbane educational institutions and which fosters the Australian games industry.

Relevance:

30.00%

Publisher:

Abstract:

Community engagement is increasingly being employed by Australian organizations as a key strategy for incorporating representative community opinions into decision-making. This trend is reflected in Australian local and state governments legislating for community consultation in major infrastructure projects, and in the increasing role of public relations practitioners in managing these programs. This study explores community engagement founded on relational theory and proposes a typology of engagement employing a relational framework. An exploratory study of 20 Australian infrastructure projects with a mandatory consultation component is analyzed using this framework. Results indicate little discrimination between the terms engagement, consultation, and participation; however, a range of tactics supported both collaborative and advocacy approaches. The implications of adopting a relational framework for community engagement programs are discussed.

Relevance:

30.00%

Publisher:

Abstract:

This thesis maps the author's journey from a music composition practice to a combined composition and performance practice. The work involves the development of a software library for the purpose of encapsulating compositional ideas in software, and realising these ideas in performance through a live coding computer music practice. The thesis examines what artistic practice emerges through live coding and software development, and whether this permits a blurring of the activities of music composition and performance. The role that software design plays in affecting musical outcomes is considered, to gain insight into how software development contributes to artistic development. The relationship between music composition and performance is also examined, to identify the means by which engaging in live coding and software development can bring these activities together. The thesis, situated within the discourse of practice-led research, documents a journey which uses the experience of software development and performance to guide the direction of the research. The journey serves as an experiment for the author in engaging with a hitherto unfamiliar musical practice, and as a roadmap for others seeking to modify or broaden their artistic practice.

Relevance:

30.00%

Publisher:

Abstract:

Process modeling is a complex organizational task that requires many iterations and much communication between the business analysts and domain specialists involved. The challenge of process modeling is exacerbated when modeling has to be performed in a cross-organizational, distributed environment. Some systems have been developed to support collaborative process modeling, all of which use traditional 2D interfaces. We present an environment for collaborative process modeling using 3D virtual environment technology. We make use of avatar instantiations of user ego centres to allow for the spatial embodiment of the user with reference to the process model. We describe an innovative prototype collaborative process modeling approach, implemented as a modeling environment in Second Life. This approach leverages virtual environments to provide user context for editing and collaborative exercises. We present a positive preliminary report on a case study in which a test group modelled a business process using the system in Second Life.

Relevance:

30.00%

Publisher:

Abstract:

Teaching and learning in working groups is a challenge to both teacher and student. Collaboration is an elusive concept, difficult to teach and impossible to enforce, yet the ability to work in this way is an essential characteristic of any Creative Industries professional. In 2006, a group of creative industries students was charged with the task of collaboratively creating a performance as part of their coursework. This research project closely followed their development as agents of collaborative creativity, using an innovative methodology which combined performance and documentary making. The result is COLLABORATORY, a DVD documentary isolating key behaviours and features of collaborative learning: convergence and divergence of ideas, characters and behaviours in collaboration, leadership and facilitation, motivation, and intersubjectivity. It is an engaging presentation of theory and practice, capturing what Miell and Littleton (2004) call "the emotional dance of collaboration".

Relevance:

30.00%

Publisher:

Abstract:

Queensland University of Technology (QUT) is a multidisciplinary university in Brisbane, Queensland, Australia, and has 40,000 students and 1,700 researchers. Notable eResearch infrastructure includes the QUT ePrints repository, Microsoft QUT Research Centre, the OAK (Open Access to Knowledge) Law Project, Cambia and leading research institutes. ---------- The Australian Government, via the Australian National Data Service (ANDS), is funding institutions to identify and describe their research datasets, to develop and populate data repositories and collaborative infrastructure, and to seed the Australian Research Data Commons. QUT is currently broadening its range of research support services, including those to support the management of research data, in recognition of the value of these datasets as products of the research process, and in order to maximize the potential for reuse. QUT is integrating Library and High Performance Computing (HPC) services to achieve its research support goals. ---------- The Library and HPC released an online survey using Key Survey to 1,700 researchers in September 2009. A comprehensive range of eResearch practices and skills was presented for response, and grouped into areas of scholarly communication and open access publishing, using collaborative technologies, data management, data collection and management, computation and visualization tools. Researchers were asked to rate their skill level on each practice. 254 responses were received over two weeks. Eight focus groups were also held with 35 higher degree research (HDR) students and staff to provide additional qualitative feedback. A similar survey was released to 100 support staff and 73 responses were received.---------- Preliminary results from the researcher survey and focus groups indicate a gap between current eResearch practices, and the potential for researchers to engage in eResearch practices. 
Researchers are more likely to seek advice from their peers than from support staff. HDR students are more positive about eResearch practices and are more willing to learn new ways of conducting research. An account of the survey methodology, the results obtained, and proposed strategies to embed eResearch practices and skills across and within the research disciplines will be provided.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is about the derivation of the addition law on an arbitrary elliptic curve and about efficiently adding points on such a curve using the derived addition law. The outcomes of this research guarantee practical speedups in higher-level operations which depend on point additions. In particular, the contributions immediately find applications in cryptology. Mastered by 19th-century mathematicians, the theory of elliptic curves has been an active area of study for decades. Elliptic curves over finite fields made their way into public key cryptography in the late 1980s with independent proposals by Miller [Mil86] and Koblitz [Kob87]. Elliptic Curve Cryptography (ECC), following Miller's and Koblitz's proposals, employs the group of rational points on an elliptic curve in building discrete-logarithm-based public key cryptosystems. Starting in the late 1990s, the emergence of the ECC market has boosted research into the computational aspects of elliptic curves. This thesis falls into the same area of research, where the main aim is to speed up the addition of rational points on an arbitrary elliptic curve (over a field of large characteristic). The outcomes of this work can be used to speed up applications which are based on elliptic curves, including cryptographic applications in ECC. The aforementioned goals of this thesis are achieved in five main steps. As the first step, this thesis brings together several algebraic tools in order to derive the unique group law of an elliptic curve. This step also includes an investigation of recent computer algebra packages and their capabilities. Although the group law is unique, its evaluation can be performed using abundant (in fact infinitely many) formulae. As the second step, this thesis searches for the best formulae for efficient addition of points. In the third step, the group law is stated explicitly by handling all possible summands. The fourth step presents the algorithms to be used for efficient point additions. In the fifth and final step, optimized software implementations of the proposed algorithms are presented in order to show that the theoretical speedups of step four can be obtained in practice. In each of the five steps, this thesis focuses on five forms of elliptic curves over finite fields of large characteristic. These forms and their defining equations are as follows:
(a) Short Weierstrass form: y^2 = x^3 + ax + b,
(b) Extended Jacobi quartic form: y^2 = dx^4 + 2ax^2 + 1,
(c) Twisted Hessian form: ax^3 + y^3 + 1 = dxy,
(d) Twisted Edwards form: ax^2 + y^2 = 1 + dx^2y^2,
(e) Twisted Jacobi intersection form: bs^2 + c^2 = 1, as^2 + d^2 = 1.
These forms are the most promising candidates for efficient computation and are thus considered in this work. Nevertheless, the methods employed in this thesis are capable of handling arbitrary elliptic curves. From a high-level point of view, the following outcomes are achieved in this thesis.
- Related literature results are brought together and revisited. In most cases, several previously missed formulae, algorithms, and efficient point representations are discovered.
- Analogies are drawn among all studied forms. For instance, it is shown that two sets of affine addition formulae are sufficient to cover all possible affine inputs, as long as the output is also an affine point, in any of these forms. In the literature, many special cases, especially interactions with points at infinity, were omitted from discussion. This thesis handles all of the possibilities.
- Several new point doubling/addition formulae and algorithms are introduced which are more efficient than the existing alternatives in the literature. Most notably, the speed of the extended Jacobi quartic, twisted Edwards, and Jacobi intersection forms is improved. New unified addition formulae are proposed for the short Weierstrass form. New coordinate systems are studied for the first time.
- An optimized implementation is developed using a combination of generic x86-64 assembly instructions and plain C. The practical advantages of the proposed algorithms are supported by computer experiments.
- All formulae presented in the body of this thesis are checked for correctness using computer algebra scripts, together with details on register allocations.
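The affine group law on the short Weierstrass form can be sketched with the classical chord-and-tangent formulae. This is a minimal illustrative implementation over a small prime field; the curve, prime, and point below are hypothetical examples, and the thesis' optimized coordinate systems avoid the per-addition field inversion used here.

```python
# Affine point addition on a short Weierstrass curve y^2 = x^3 + a*x + b
# over GF(p). None represents the point at infinity (the group identity).

def ec_add(P, Q, a, p):
    """Add two affine points using the chord-and-tangent rule."""
    if P is None:
        return Q
    if Q is None:
        return P
    x1, y1 = P
    x2, y2 = Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None  # P + (-P) = point at infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p         # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

# Hypothetical example curve: y^2 = x^3 + 2x + 3 over GF(97),
# with P = (3, 6) on the curve since 6^2 = 27 + 6 + 3 (mod 97).
a, b, p = 2, 3, 97
P = (3, 6)
R = ec_add(P, P, a, p)  # doubling P
assert (R[1] ** 2 - (R[0] ** 3 + a * R[0] + b)) % p == 0  # result stays on the curve
```

The inversion `pow(x, -1, p)` is the costly step this formula incurs per addition; projective and extended coordinate systems of the kind studied in the thesis trade it for extra multiplications.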

Relevance:

30.00%

Publisher:

Abstract:

Distributed Denial of Service (DDoS) attacks have become one of the biggest threats to resources on the Internet. The purpose of these attacks is to prevent servers from providing services to legitimate users. These attacks are also used to consume network bandwidth. Current intrusion detection systems can detect the attacks but cannot prevent them or track the location of intruders. Some schemes prevent attacks by simply discarding attack packets, which saves the victim from the attack, but network bandwidth is still wasted. In our opinion, DDoS requires a distributed solution to avoid this wastage of resources. This paper presents a system that helps not only in detecting such attacks but also in tracing and blocking the multiple intruders (to save bandwidth as well) using intelligent software agents. The system responds dynamically and can be integrated with existing network defense systems without disturbing the existing Internet model. We have implemented an agent-based network monitoring system in this regard.
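The detection step can be illustrated with a deliberately simplified, non-agent sketch: flag any source whose per-window packet count exceeds a rate threshold. The threshold, window, and record format are illustrative assumptions, not details from the paper's agent-based system.

```python
# A minimal per-source rate-threshold flood detector. A real agent-based
# system would distribute this check across monitoring agents and follow
# detection with traceback and blocking; this sketch covers detection only.
from collections import Counter

def flag_flooders(packets, window_limit=100):
    """packets: iterable of (source_ip, ...) records observed in one
    time window. Return the set of sources exceeding window_limit."""
    counts = Counter(src for src, *_ in packets)
    return {src for src, n in counts.items() if n > window_limit}

# Synthetic traffic: one flooding source and one normal source.
traffic = [("10.0.0.1", i) for i in range(500)] + [("10.0.0.2", i) for i in range(5)]
suspects = flag_flooders(traffic)
```

In a distributed deployment, each agent would report its local suspect set upstream so blocking can happen near the sources, which is what saves the bandwidth otherwise wasted carrying attack packets to the victim.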

Relevance:

30.00%

Publisher:

Abstract:

Background: The quality of stormwater runoff from ports is significant as it can be an important source of pollution to the marine environment. This is also a significant issue for the Port of Brisbane as it is located in an area of high environmental values. Therefore, it is imperative to develop an in-depth understanding of stormwater runoff quality to ensure that appropriate strategies are in place for quality improvement, where necessary. To this end, the Port of Brisbane Corporation aimed to develop a port specific stormwater model for the Fisherman Islands facility. The need has to be considered in the context of the proposed future developments of the Port area. ----------------- The Project: The research project is an outcome of the collaborative Partnership between the Port of Brisbane Corporation (POBC) and Queensland University of Technology (QUT). A key feature of this Partnership is that it seeks to undertake research to assist the Port in strengthening the environmental custodianship of the Port area through ‘cutting edge’ research and its translation into practical application. ------------------ The project was separated into two stages. The first stage developed a quantitative understanding of the generation potential of pollutant loads in the existing land uses. This knowledge was then used as input for the stormwater quality model developed in the subsequent stage. The aim is to expand this model across the yet to be developed port expansion area. This is in order to predict pollutant loads associated with stormwater flows from this area with the longer term objective of contributing to the development of ecological risk mitigation strategies for future expansion scenarios. ----------------- Study approach: Stage 1 of the overall study confirmed that Port land uses are unique in terms of the anthropogenic activities occurring on them. 
This uniqueness in land use results in distinctive stormwater quality characteristics different to those of conventional urban land uses. Therefore, it was not scientifically valid to consider the Port as belonging to a single land use category, or as being similar to any typical urban land use. The approach adopted in this study was very different to conventional modelling studies, where modelling parameters are developed using calibration. The field investigations undertaken in Stage 1 of the overall study helped to create fundamental knowledge on pollutant build-up and wash-off in different Port land uses. This knowledge was then used in computer modelling so that the specific characteristics of pollutant build-up and wash-off could be replicated. This meant that no calibration processes were involved, due to the use of measured parameters for build-up and wash-off. ---------------- Conclusions: Stage 2 of the study was primarily undertaken using the SWMM stormwater quality model. It is a physically based model which replicates natural processes as closely as possible. The time step used and the catchment variability considered were adequate to accommodate the temporal and spatial variability of the input parameters, and the parameters used in the modelling reflect the true nature of rainfall-runoff and pollutant processes to the best of currently available knowledge. In this study, the initial loss values adopted for the impervious surfaces are relatively high compared to values noted in the research literature. However, given the scientifically valid approach used for the field investigations, it is appropriate to adopt the initial losses derived from this study for future modelling of Port land uses. The relatively high initial losses will significantly reduce both the runoff volume generated and the frequency of runoff events. Apart from initial losses, most of the other parameters used in SWMM modelling are generic to most modelling studies.
Development of parameters for MUSIC model source nodes was one of the primary objectives of this study. MUSIC uses the mean and standard deviation of pollutant parameters based on a normal distribution. However, based on the values generated in this study, the variation of Event Mean Concentrations (EMCs) for Port land uses within the given investigation period does not fit a normal distribution. This is possibly because only one specific location, namely the Port of Brisbane, was considered, unlike in the case of the MUSIC model, where a range of areas with different geographic and climatic conditions was investigated. Consequently, the assumptions used in MUSIC are not fully applicable to the analysis of water quality in Port land uses. Therefore, in using the parameters included in this report for MUSIC modelling, it is important to note that they may result in under- or over-estimation of annual pollutant loads. It is recommended that the annual pollutant load values given in the report be used as a guide to assess the accuracy of the modelling outcomes. A step-by-step guide for using the knowledge generated from this study for MUSIC modelling is given in Table 4.6. ------------------ Recommendations: The following recommendations are provided to further strengthen the cutting-edge nature of the work undertaken: * It is important to further validate the approach recommended for stormwater quality modelling at the Port. Validation will require data collection on rainfall, runoff and water quality from the selected Port land uses. Additionally, the recommended modelling approach could be applied to a soon-to-be-developed area to assess 'before' and 'after' scenarios. * In the modelling study, TSS was adopted as the surrogate parameter for other pollutants. This approach was based on other urban water quality research undertaken at QUT. The validity of this approach should be further assessed for Port land uses.
* The adoption of TSS as a surrogate parameter for other pollutants, and the confirmation that the <150 μm particle size range was predominant in suspended solids in pollutant wash-off, give rise to a number of important considerations. The ability of the existing structural stormwater mitigation measures to remove the <150 μm particle size range needs to be assessed. The feasibility of introducing source control measures, as opposed to end-of-pipe measures, for stormwater quality improvement may also need to be considered.
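The pollutant build-up and wash-off processes measured in Stage 1 are commonly represented by exponential functions in SWMM-style water quality modelling. The sketch below shows that general functional form; the coefficient values are illustrative assumptions, not the Port-specific parameters derived in the study.

```python
# Exponential build-up and wash-off functions of the general form used in
# SWMM-style stormwater quality models. All coefficients here (b_max, k, c)
# are illustrative placeholders, not values from the Port of Brisbane study.
import math

def buildup(days, b_max, k):
    """Pollutant mass per unit area after 'days' of dry weather:
    B(t) = b_max * (1 - exp(-k * t)), saturating at b_max."""
    return b_max * (1.0 - math.exp(-k * days))

def washoff(available_mass, intensity, duration, c):
    """Mass washed off by a storm of given intensity (mm/h) and
    duration (h): W = B * (1 - exp(-c * i * t))."""
    return available_mass * (1.0 - math.exp(-c * intensity * duration))

# Illustrative use: 7 dry days of build-up, then a 2-hour, 10 mm/h storm.
available = buildup(7, b_max=100.0, k=0.4)
removed = washoff(available, intensity=10.0, duration=2.0, c=0.1)
```

Using field-measured values for `k` and `c`, as the study did, is what removes the need for the calibration step that conventional modelling relies on.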

Relevance:

30.00%

Publisher:

Abstract:

Computer profiling is the automated forensic examination of a computer system in order to provide a human investigator with a characterisation of the activities that have taken place on that system. As part of this process, the logical components of the computer system – components such as users, files and applications – are enumerated, and the relationships between them discovered and reported. This information is enriched with traces of historical activity drawn from system logs and from evidence of events found in the computer file system. A potential problem with the use of such information is that some of it may be inconsistent and contradictory, thus compromising its value. This work examines the impact of temporal inconsistency in such information and discusses two types of temporal inconsistency that may arise – inconsistency arising out of the normal errant behaviour of a computer system, and inconsistency arising out of deliberate tampering by a suspect – along with techniques for dealing with inconsistencies of the latter kind. We examine the impact of deliberate tampering through experiments conducted with prototype computer profiling software. Based on the results of these experiments, we discuss techniques which can be employed in computer profiling to deal with such temporal inconsistencies.
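One elementary form of temporal inconsistency can be sketched as follows: scanning a causally ordered event sequence for timestamps that run backwards. This is an illustration of the general idea only, not the profiling software's actual method, and the event names and timestamps are hypothetical.

```python
# Detect backward timestamp jumps in a list of events that are claimed to
# be in causal order. A backward jump may indicate clock error (normal
# errant behaviour) or deliberate timestamp tampering by a suspect.

def find_backward_jumps(events):
    """events: list of (name, timestamp) pairs in claimed causal order.
    Return the (earlier_event, later_event) pairs where the later event
    carries an earlier timestamp."""
    suspicious = []
    for (n1, t1), (n2, t2) in zip(events, events[1:]):
        if t2 < t1:
            suspicious.append((n1, n2))
    return suspicious

# Hypothetical trace: a file 'accessed' before it was 'modified' is suspect.
trace = [("file_created", 100), ("file_modified", 250), ("file_accessed", 180)]
flags = find_backward_jumps(trace)
```

Distinguishing the two sources of inconsistency the paper discusses requires more context than a pairwise check (e.g. the magnitude and pattern of the jumps), which is where the experimental techniques come in.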

Relevance:

30.00%

Publisher:

Abstract:

The issue of cultural competency in health care continues to be a priority in Australia for health and human services professionals. Cultural competence in caring for Aboriginal and Torres Strait Islander peoples is of increasing interest, and is a priority in closing the gap in health disparities between Indigenous and non-Indigenous Australians. Through a collaborative conversation, the authors draw on a case study, personal experience and the literature to highlight some of the issues associated with employing culturally appropriate, culturally safe and culturally competent approaches when caring for Aboriginal and Torres Strait Islander peoples. The intent of this article is to encourage discussion on the topic of cultural competency, and to challenge health professionals and academics to think and act on racism, colonialism, historical circumstances, and the political, social, economic and geographical realms in which we live and work, all of which impact on cultural competency.

Relevance:

30.00%

Publisher:

Abstract:

Reputation and proof-of-work systems have been outlined as methods bot masters will soon use to defend their peer-to-peer botnets. These techniques are designed to prevent sybil attacks, such as those that led to the downfall of the Storm botnet. To evaluate the effectiveness of these techniques, a botnet that employed them was simulated, and the amount of resources required to stage a successful sybil attack against it was measured. While the proof-of-work system was found to increase the resources required for a successful sybil attack, the reputation system was found to lower the amount of resources required to disable the botnet.
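A hash-based proof-of-work puzzle of the general kind evaluated here can be sketched as follows. Requiring each joining peer to solve such a puzzle raises the cost of minting the many fake identities a sybil attack needs. The puzzle construction and difficulty are illustrative assumptions, not the simulated botnet's actual scheme.

```python
# Minimal hash-based proof-of-work for peer admission: a joining peer must
# find a nonce whose SHA-256 digest, combined with its peer id, falls below
# a difficulty target. (Construction is an illustrative assumption.)
import hashlib

def solve_pow(peer_id: bytes, difficulty_bits: int) -> int:
    """Brute-force a nonce with 'difficulty_bits' leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(peer_id + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # expected cost ~2**difficulty_bits hashes
        nonce += 1

def verify_pow(peer_id: bytes, nonce: int, difficulty_bits: int) -> bool:
    """Verification costs a single hash, regardless of difficulty."""
    digest = hashlib.sha256(peer_id + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

# A sybil attacker must pay the solving cost once per fake identity,
# while honest peers and verifiers pay only one hash to check.
nonce = solve_pow(b"peer-1", 8)
```

The asymmetry (expensive to solve, cheap to verify) is exactly what makes the attacker's per-identity cost scale with the puzzle difficulty, matching the paper's finding that proof-of-work raises the resources a successful sybil attack requires.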