Abstract:
The two-hour game jam was held as part of the State Library of Queensland 'Garage Gamer' series of events in summer 2013, at the SLQ exhibition. An aspect of the exhibition was the series of 'Level Up' game nights. We hosted the first of these, under the auspices of brIGDA, Game On. It was a party, but the focal point of the event was a live-streamed two-hour game jam. Game jams have become popular amongst the game development and design community in recent years, particularly with the growth of the Global Game Jam, a yearly event which brings thousands of game makers together across different sites in different countries. Other established jams take place online, for example the Ludum Dare challenge, which has been running since 2002. Other challenges follow the same model in more intimate circumstances, and it is now common to find institutions and groups holding their own small local game-making jams. There are variations around the format, some jams are more competitive than others for example, but a common aspect is the creation of an intense creative crucible centred on teamwork and ‘accelerated game development’. Works (games) produced during these intense events often display more experimental qualities than those undertaken as commercial projects. In part this is because the typical jam starts with a conceptual design brief, perhaps a single word or, in the case of the specific game jam described in this paper, three words. Teams have to envision the challenge keyword/s as a game design using whatever skills and technologies they can, and produce a finished working game in the time given. Game jams thus provide design researchers with extraordinary fodder, and recent years have also seen a number of projects which seek to illuminate the design process as seen in these events. For example, Gaydos, Harris and Martinez discuss the opportunity of the jam to expose students to principles of design process and design spaces (2011). Rouse muses on the game jam ‘as radical practice’ and a ‘corrective to game creation as it is normally practiced’. His observations about his own experience in a jam emphasise the same artistic endeavour foregrounded earlier, where the experience is about creation that is divorced from the instrumental motivations of commercial game design (Rouse 2011) and where the focus is on process over product. Other participants remark on the social milieu of the event as a critical factor and on the collaborative opportunity as a rich site to engage participants in design processes (Shin et al., 2012). Shin et al. are particularly interested in the notion of the site of the process and the ramifications of participants being in the same location. They applaud the more localised event, where there is an emphasis on local participation and collaboration. For other commentators, it is specifically the social experience in the place of the jam that is the most important aspect (see Keogh 2011): not the material site but rather the physical, embodied experience of ‘being there’ and being part of the event. Participants talk about game jams they have attended in a manner similar to the observations made by Dourish, where the experience is layered on top of the physical space of the event (Dourish 2006). It is as if the event has taken on qualities of place, where we find echoes of Tuan’s description of a particular site having an aura of history that makes it a very different place, redolent and evocative (Tuan 1977).
The two-hour game jam held during the SLQ Garage Gamer program was all about social experience.
Abstract:
While over the past decade many Australian schools have come to understand the transformative potential of digitally rich teaching and learning, traditional models of schooling continue to dominate. Even with significant investment in the area, both in terms of digital resourcing and teacher professional development, innovation has generally occurred only in individual classrooms or ‘pockets’ within schools. This article discusses three interdependent conditions which need to exist as a foundation in order to facilitate the opportunity for transformation from traditional to digitally rich ways of working in primary, middle and secondary schools or colleges. Distributed and transformational leadership approaches are critiqued, with core elements identified which facilitate change. The establishment of a vision is identified and discussed as a fundamental driver and rudder for school transformation. The importance of creating and maintaining urgency to compel a school community to adopt and embed change is unpacked. The article concludes with a synthesis of the three preconditions and recommendations for proponents of digital school transformation.
Abstract:
In this chapter we will describe the contemporary variety of practice-oriented training institutions in Australia. We will examine the different ways in which public and private providers are managing the challenges of digitization and convergence. We will consider the logics governing film education that this mix of providers pulls into focus, and we will outline some of the challenges providers face in educating, (re)training, and preparing their graduates for life and work beyond the film school. These challenges highlight questions about the accountabilities and responsibilities of practice-oriented film education institutions. The chapter begins with an introductory section that outlines these logics and questions and explores some of the tensions and dynamics within the spectrum of issues through which we can understand film schools. The chapter then goes on briefly to describe the multifaceted training landscape in Australia, before profiling the leading public provider, the Australian Film, Television and Radio School (AFTRS), as well as the other leading public providers, the Victorian College of the Arts and the Griffith Film School. It concludes with a short section on film education in primary and secondary schools as the education sector prepares for the implementation of a national curriculum in which ‘media arts’ has a new centrality.
Abstract:
Providing an incentive is becoming common practice among blood service organisations. Research is showing that, driven by self-oriented motives rather than pure philanthropic intentions, people increasingly want something in return for their support. It is contended that individuals donate conspicuously in the hope that it will improve their social standing. Yet there is limited evidence for the effectiveness of conspicuous recognition strategies, and no studies, to the researcher’s knowledge, have examined conspicuous donation strategies in an online social media context. There is a need to understand what value drives individuals to donate blood, and whether conspicuous donation strategies are a source of such value post blood donation. The purpose of this paper is to conceptualise how conspicuous donation strategies, in the form of virtual badges on social media sites, can be applied to the social behaviour of blood donation, as a value-adding tool, to encourage repeat behaviour.
Abstract:
The world’s increasing complexity, competitiveness, interconnectivity, and dependence on technology generate new challenges for nations and individuals that cannot be met by continuing education as usual. With the proliferation of complex systems have come new technologies for communication, collaboration, and conceptualisation. These technologies have led to significant changes in the forms of mathematical and scientific thinking required beyond the classroom. Modelling, in its various forms, can develop and broaden students’ mathematical and scientific thinking beyond the standard curriculum. This chapter first considers future competencies in the mathematical sciences within an increasingly complex world. Consideration is then given to interdisciplinary problem solving and models and modelling, as one means of addressing these competencies. Illustrative case studies involving complex, interdisciplinary modelling activities in Years 1 and 7 are presented.
Abstract:
Throughout Australia (and in comparable urban contexts around the world) public spaces may be said to be under attack, both from developers and from attempts by civic authorities to regulate, restrict, rebrand and reframe them. A consequence of the increasingly security-driven, privatised and surveilled nature of public space is the exclusion and displacement of those considered flawed and unwelcome in the ‘spectacular’ consumption spaces of many major urban centres. In the name of urban regeneration, processes of securitisation, ‘gentrification’ and creative cities discourses can refashion public space as a site of selective inclusion and exclusion. In this context of monitoring and control procedures, children and young people’s use of space in parks, neighbourhoods, shopping malls and streets is often viewed as a threat to the social order, requiring various forms of punitive and/or remedial action. This paper discusses developments in the surveillance, governance and control of public space used by children and young people in particular, and the capacity for their displacement and marginalisation, diminishing their sense of place and belonging and their right to public space as an expression of their civil, political and social citizenship(s).
Abstract:
The ability to identify and assess user engagement with transmedia productions is vital to the success of individual projects and the sustainability of this mode of media production as a whole. It is essential that industry players have access to tools and methodologies that offer the most complete and accurate picture of how audiences/users engage with their productions and which assets generate the most valuable returns on investment. Drawing upon research conducted with Hoodlum Entertainment, a Brisbane-based transmedia producer, this project involved an initial assessment of the way engagement tends to be understood, why standard web analytics tools are ill-suited to measuring it, how a customised tool could offer solutions, and why this question of measuring engagement is so vital to the future of transmedia as a sustainable industry. Working with data provided by Hoodlum Entertainment and Foxtel Marketing, the outcome of the study was a prototype for a custom data visualisation tool that allowed access, manipulation and presentation of user engagement data, both historic and predictive. The prototyped interfaces demonstrate how the visualisation tool would collect and organise data specific to multiplatform projects by aggregating data across a number of platform reporting tools. Such a tool is designed to encompass not only platforms developed by the transmedia producer but also sites developed by fans. The visualisation tool accounted for multiplatform experience projects whose top level is comprised of people, platforms and content. People include characters, actors, audience, distributors and creators. Platforms include television, Facebook and other relevant social networks, literature, cinema and other media that might be included in the multiplatform experience. Content refers to discrete media texts employed within the platform, such as a tweet, a YouTube video, a Facebook post, an email, a television episode, etc. Core content is produced by the creators of multiplatform experiences to advance the narrative, while complementary content generated by audience members offers further contributions to the experience. Equally important is the timing with which the components of the experience are introduced and how they interact with and impact upon each other. By combining, filtering and sorting these elements in multiple ways, we can better understand the value of certain components of a project. The tool also offers insights into the relationship between the timing of the release of components and user activity associated with them, which further highlights the efficacy (or, indeed, failure) of assets as catalysts for engagement. In collaboration with Hoodlum we have developed a number of design scenarios experimenting with the ways in which data can be visualised and manipulated to tell a more refined story about the value of user engagement with certain project components and activities. This experimentation will serve as the basis for future research.
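To make the people/platforms/content model concrete, the following is a minimal Python sketch of how content items pulled from different platform reporting tools might be aggregated and filtered by platform, kind and release timing. All names here are hypothetical illustrations, not the prototype's actual code, and the structure is an assumption based only on the description above.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class ContentItem:
        title: str
        platform: str        # e.g. "Twitter", "Facebook", "TV"
        kind: str            # "core" (creator-made) or "complementary" (fan-made)
        released: datetime
        engagements: int     # count aggregated from that platform's reporting tool

    def engagement_by_platform(items):
        """Total engagement per platform, across core and complementary content."""
        totals = {}
        for item in items:
            totals[item.platform] = totals.get(item.platform, 0) + item.engagements
        return totals

    def released_between(items, start, end):
        """Filter items by release window, to relate release timing to activity."""
        return [i for i in items if start <= i.released <= end]

Filtering by release window before aggregating is what would let an analyst ask, for example, whether a core television episode lifted engagement with complementary fan content in the days that followed.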
Abstract:
Introduction: In January 2013, clinicians in Honiara, Solomon Islands noted several patients presenting with dengue-like illness. Serum from three cases tested positive for dengue by rapid diagnostic test. Subsequent increases in cases were reported, and the outbreak was confirmed as dengue serotype-3 by further laboratory tests. This report describes the ongoing outbreak investigation, findings and response.
Methods: Enhanced dengue surveillance was implemented in the capital, Honiara, and in the provinces. This included training health staff on dengue case definitions, data collection and reporting. Vector surveillance was also conducted.
Results: From 3 January to 15 May 2013, 5254 cases of suspected dengue were reported (101.8 per 10 000 population), including 401 hospitalizations and six deaths. The median age of cases was 20 years (range zero to 90), and 86% were reported from Honiara. Both Aedes aegypti and Aedes albopictus were identified in Honiara. Outbreak response measures included clinical training seminars, vector control activities, implementation of diagnostic and case management protocols and a public communication campaign.
Discussion: This was the first large dengue outbreak documented in Solomon Islands. Factors that may have contributed to this outbreak include a largely susceptible population, the presence of a highly efficient dengue vector in Honiara, a high-density human population with numerous breeding sites and favourable weather conditions for mosquito proliferation. Although the number of cases has plateaued since 1 April, continued enhanced nationwide surveillance and response activities are necessary.
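As a quick consistency check on the reported attack rate, the implied surveillance population can be back-calculated from the case count and the rate per 10 000. This is a sketch only; the population figure itself is an inference, not stated in the report.

    cases = 5254
    rate_per_10k = 101.8

    # Back-calculate the implied population covered by surveillance
    # (an illustrative inference; the report does not state this figure).
    population = cases * 10_000 / rate_per_10k
    print(round(population))                      # ~516,110 people

    # Forward check: cases per 10,000 population recovers the reported rate.
    print(round(cases / population * 10_000, 1))  # 101.8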
Predicting invasion in grassland ecosystems: is exotic dominance the real embarrassment of richness?
Abstract:
Invasions have increased the size of regional species pools, but are typically assumed to reduce native diversity. However, global-scale tests of this assumption have been elusive because of the focus on exotic species richness, rather than relative abundance. This is problematic because low invader richness can indicate invasion resistance by the native community or, alternatively, dominance by a single exotic species. Here, we used a globally replicated study to quantify relationships between exotic richness and abundance in grass-dominated ecosystems in 13 countries on six continents, ranging from salt marshes to alpine tundra. We tested effects of human land use, native community diversity, herbivore pressure, and nutrient limitation on exotic plant dominance. Despite its widespread use, exotic richness was a poor proxy for exotic dominance at low exotic richness, because sites that contained few exotic species ranged from relatively pristine (low exotic richness and cover) to almost completely exotic-dominated ones (low exotic richness but high exotic cover). Both exotic cover and richness were predicted by native plant diversity (native grass richness) and land use (distance to cultivation). Although climate was important for predicting both exotic cover and richness, climatic factors predicting cover (precipitation variability) differed from those predicting richness (maximum temperature and mean temperature in the wettest quarter). Herbivory and nutrient limitation did not predict exotic richness or cover. Exotic dominance was greatest in areas with low native grass richness at the site or regional scale. Although this could reflect native grass displacement, a lack of biotic resistance is a more likely explanation, given that grasses comprise the most aggressive invaders. These findings underscore the need to move beyond richness as a surrogate for the extent of invasion, because this metric confounds monodominance with invasion resistance. Monitoring species' relative abundance will more rapidly advance our understanding of invasions.
Abstract:
There has been considerable scientific interest in personal exposure to ultrafine particles (UFP). In this study, the inhaled particle surface area doses and relative dose intensities in the tracheobronchial and alveolar regions of the lungs were calculated using the measured 24-hour UFP time series of school children's personal exposures for each recorded activity. Bayesian hierarchical modelling was used to determine mean doses and dose intensities for the various microenvironments. Analysis of measured personal exposures for 137 participating children from 25 schools in the Brisbane Metropolitan Area showed similar trends for all the participating children. Bayesian regression modelling was performed to calculate the daily proportion of children's total doses in different microenvironments. The proportions of alveolar dose in the total daily dose for home, school, commuting and other were 55.3%, 35.3%, 4.5% and 5.0%, respectively, with the home microenvironment contributing the majority of children's total daily dose. Children's mean indoor dose was never higher than the outdoor dose at any of the schools, indicating there were no persistent indoor particle sources in the classrooms during the measurements. Outdoor activities, eating/cooking at home and commuting were the three activities with the highest dose intensities. Personal exposure was influenced more by ambient particle levels than by immediate traffic.
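The proportion calculation itself is simple bookkeeping once each inhaled dose is attributed to a microenvironment. The Python sketch below illustrates it with invented dose values; the study's actual estimates came from Bayesian hierarchical and regression models, not from raw sums like this.

    # Each record attributes an inhaled alveolar dose to a microenvironment.
    # The dose values below are invented purely for illustration.
    activity_doses = [
        ("home", 45.1), ("school", 28.7), ("commuting", 3.6),
        ("home", 12.0), ("other", 4.1),
    ]

    totals = {}
    for env, dose in activity_doses:
        totals[env] = totals.get(env, 0.0) + dose

    day_total = sum(totals.values())
    shares = {env: round(100 * d / day_total, 1) for env, d in totals.items()}
    print(shares)  # percentage of the total daily dose per microenvironment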
Abstract:
High-risk adolescents are shown to jeopardise their future social and health functioning, as well as to place themselves and others at immediate risk of harm. The challenge of “reaching” high-risk adolescents, who are often marginalised, is considerable. There is a positive relationship between age and risk-taking behaviours during adolescence. This study examines outcomes (alcohol use, transport risk behaviours, violence) of a school-based intervention (SPIY) by comparing low-medium risk adolescents with high-risk adolescents over a six-month period.
Abstract:
Enterprise Social Networks continue to be adopted by organisations looking to increase collaboration between employees, customers and industry partners. Offering a varied range of features and functionality, this technology can be distinguished by the underlying business models that providers of this software deploy. This study identifies and describes the different business models through an analysis of leading Enterprise Social Networks: Yammer, Chatter, SharePoint, Connections, Jive, Facebook and Twitter. A key contribution of this research is the identification of consumer and corporate models as the two extreme approaches. These findings align well with research on the adoption of Enterprise Social Networks that has discussed bottom-up and top-down approaches. Of specific interest are hybrid models that wrap a corporate model within a consumer model and may therefore offer the synergies of both. From a broader perspective, this can be seen as the merging of the corporate and consumer markets for IT products and services.
Abstract:
Software as a Service (SaaS) is anticipated to provide significant benefits to small and medium enterprises (SMEs) due to ease of access to high-end applications, 24/7 availability, utility pricing, etc. However, underlying SaaS is the assumption that SMEs will interact directly with the SaaS vendor and use a self-service model. In practice, we see the rise of SaaS intermediaries who support SMEs in using SaaS. This paper reports on an empirical study of how such intermediaries support SMEs in sourcing and leveraging SaaS for their business. The knowledge contributions of this paper are: (1) the identification and description of the role of SaaS intermediaries, (2) the specification of a more basic intermediary role with a technology orientation and an operational alignment perspective, and (3) the specification of a more value-adding role with a customer orientation and a strategic alignment perspective.
Abstract:
This paper investigates how Enterprise Architecture (EA) evolves in response to emerging trends. It specifically explores how EA integrates the Service-oriented Architecture (SOA). Archer’s Morphogenetic theory is used as an analytical approach to distinguish the architectural conditions under which SOA is introduced, to study the relationships between these conditions and SOA introduction, and to reflect on the EA evolution (elaborations) that then takes place. The paper focuses on the reasons why EA evolution does or does not take place, and on the architectural changes that can happen due to SOA integration. The research builds on sound theoretical foundations to discuss EA evolution in a field that often lacks a solid theoretical groundwork. Specifically, it proposes that critical realism, through morphogenetic theory, can provide a useful theoretical foundation to study EA evolution. The initial results of a literature review (an a-priori model) were extended using exploratory interviews. The findings of this study are threefold. First, there are five different levels of EA-SOA integration outcomes. Second, a mature EA, a flexible and well-defined EA framework, and comprehensive EA objectives improve the integration outcomes. Third, the analytical separation using Archer’s theory is helpful for understanding how these different integration outcomes are generated.
Abstract:
This paper proposes that critical realism can provide a useful theoretical foundation to study enterprise architecture (EA) evolution. Specifically, it investigates the practically relevant and academically challenging question of how EAs integrate the Service-oriented Architecture (SOA). Archer’s Morphogenetic theory is used as an analytical approach to distinguish the architectural conditions under which SOA is introduced, to study the relationships between these conditions and SOA introduction, and to reflect on the EA evolution (elaborations) that then takes place. The focus lies on the reasons why EA evolution takes place (or not) and on the architectural changes that happen. The paper uses the findings of a literature review to build an a-priori model, informed by Archer’s theory, to understand EA evolution in a field that often lacks a solid theoretical groundwork. The findings are threefold. First, EA can evolve on different levels (different integration outcomes). Second, the integration outcomes are classified into three levels: business architecture, information systems architecture and technology architecture. Third, the analytical separation using Archer’s theory is helpful for understanding how these different integration outcomes are generated.