459 results for Defining Entertainment
Abstract:
Winds of Change is a short film that contrasts with The Tragic Tale of Surachai by tracing the life of a rural Australian man who is left a quadriplegic after a near-fatal motorcycle accident. The film highlights issues confronting people living with a disability and the impact of disability on family. The film was developed to prompt discussion amongst students studying social work and human services at Queensland University of Technology.
Abstract:
Spudmonkey is an Australian feature film about a pizza delivery boy who achieves his dream of drumming in a successful rock band, only to be replaced by computerised drums. Genre: comedy. Exclusive cinema release on October 30th, 2008 at the Blueroom Cinebar, Rosalie, Queensland. Spudmonkey can now be viewed online: http://www.youtube.com/watch?v=YD7RpryDxBI
Abstract:
Process mining techniques are able to extract knowledge from event logs commonly available in today's information systems. These techniques provide new means to discover, monitor, and improve processes in a variety of application domains. There are two main drivers for the growing interest in process mining. On the one hand, more and more events are being recorded, thus providing detailed information about the history of processes. On the other hand, there is a need to improve and support business processes in competitive and rapidly changing environments. This manifesto is created by the IEEE Task Force on Process Mining and aims to promote the topic of process mining. Moreover, by defining a set of guiding principles and listing important challenges, this manifesto hopes to serve as a guide for software developers, scientists, consultants, business managers, and end-users. The goal is to increase the maturity of process mining as a new tool to improve the (re)design, control, and support of operational business processes.
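To make the core idea concrete, the following is a minimal sketch of one elementary discovery step: counting the directly-follows relation in an event log. The log format and activity names are invented for illustration and are not taken from the manifesto itself.

```python
from collections import Counter

# A toy event log: each trace is the ordered list of activities
# observed for one case (e.g. one order, one patient). Illustrative only.
event_log = [
    ["register", "check", "decide", "notify"],
    ["register", "check", "recheck", "decide", "notify"],
    ["register", "decide", "notify"],
]

def directly_follows(log):
    """Count how often activity a is immediately followed by activity b."""
    df = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

for (a, b), n in sorted(directly_follows(event_log).items()):
    print(f"{a} -> {b}: {n}")
```

Discovery algorithms then turn such relations into a process model; monitoring and improvement compare that model against further logged behaviour.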
Abstract:
This project involved the complete refurbishment and extension of a 1980s two-storey domestic brick building, previously used as a Boarding House (Class 3), into Middle School facilities (Class 9b) on a heritage-listed site at Nudgee College secondary school, Brisbane. The building now accommodates 12 technologically advanced classrooms, a computer lab and learning support rooms, a tuckshop, an art room, a mini library/reading/stage area, dedicated work areas for science and large projects with access to water on both floors, staff facilities and an undercover play area suitable for assemblies and presentations. The project was based on a Reggio Emilia approach, in which the organisation of the physical environment is referred to as the child's third teacher, creating opportunities for complex, varied, sustained and changing relationships between people and ideas. Classrooms open onto a communal central piazza and are integrated with the rest of the school, and the school with the surrounding community. To achieve this linkage of the building with the overall masterplan of the site, a key strategy of the internal planning was to orient teaching areas around a well-defined active circulation space that breaks out of the building form to legibly define the new access points to the building and connect to the pathway network of the campus. The width of the building allowed for classrooms and a generous corridor that has become a 'breakout' teaching area for art, IT and small-group activities. Large sliding glass walls allow teachers to maintain supervision of students across all areas and allow maximum light penetration through small domestic window openings into the deep, low-height spaces. The building was also designed to uphold cultural characteristics from the Edmund Rice Education Charter (2004). Coherent planning is accompanied by a quality fit-out, creating a vibrant and memorable environment in which to deliver the upper primary curriculum. Consistent with the Reggio Emilia approach, materials expressive of the school's colours are used in a contemporary, adventurous manner to create panels of colour useful for massing and for defining the 'breakout' teaching areas and paths of travel, and storage elements are detailed and arranged to draw attention to their aesthetic features. Modifications were difficult due to the random placement of load-bearing walls, minimum ceiling heights, the general standard of finishes, and new fire and energy requirements; however, the reuse of this building was assessed to be up to 30% cheaper than an equivalent new building. The fit-out integrates information technology and services at a level not usually found in primary school facilities. This has been achieved within the existing building fabric through thoughtful detailing and coordination with allied disciplines.
Abstract:
Technologies and languages for integrated processes are a relatively recent innovation, and over that period many divergent waves of innovation have transformed process integration. Like sockets and distributed objects, early workflow systems offered programming interfaces that connected the process modelling layer to middleware. BPM systems emerged later, connecting the modelling world to middleware through components. While BPM systems increased ease of use (modelling convenience), long-standing and complex interactions involving many process instances remained difficult to model. Enterprise Service Buses (ESBs) followed, connecting process models to heterogeneous forms of middleware. ESBs, however, generally forced modellers to choose a particular underlying middleware and to stick to it, despite their ability to connect with many forms of middleware. Furthermore, ESBs encourage process integrations to be modelled on their own, logically separate from the process model. This can lead to an inability to reason about long-standing conversations at the process layer. Technologies and languages for process integration generally lack formality. This has led to arbitrariness in the underlying language building blocks. Conceptual holes exist in a range of technologies and languages for process integration, and this can lead to customer dissatisfaction and the failure of integration projects to reach their potential. Standards for process integration share the same fundamental flaws as these languages and technologies. Standards are also in direct competition with one another, causing a lack of clarity. Thus the area of greatest risk in a BPM project remains process integration, despite major advancements in the technology base. This research examines some fundamental aspects of communication middleware and how these fundamental building blocks of integration can be brought to the process modelling layer in a technology-agnostic manner. In this way, process modelling can be conceptually complete without becoming locked into a particular middleware technology. Coloured Petri nets are used to define a formal semantics for the fundamental aspects of communication middleware; they provide the means to define and model the dynamic aspects of various integration middleware. Process integration patterns are used as a tool to codify common problems to be solved. Object Role Modelling, a formal modelling technique, was used to define the syntax of a proposed process integration language. This thesis provides several contributions to the field of process integration. It proposes a framework defining the key notions of integration middleware; this framework provides a conceptual foundation upon which a process integration language could be built. The thesis defines an architecture that allows various forms of middleware to be aggregated and reasoned about at the process layer. It provides a comprehensive set of process integration patterns, which constitute a benchmark for the kinds of problems a process integration language must support. Finally, the thesis proposes a process integration modelling language and a partial implementation that is able to enact the language. A process integration pilot project in a German hospital, based on the ideas in this thesis, is briefly described at the end.
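As a hedged illustration of what a technology-agnostic integration layer can look like, the sketch below defines an abstract channel that a process model could target, with middleware-specific backends plugged in behind it. The interface and names are hypothetical, not the language proposed in the thesis.

```python
from abc import ABC, abstractmethod

class Channel(ABC):
    """A middleware-agnostic channel as it might appear at the
    process-modelling layer. Concrete backends bind it to a
    particular transport. Names here are hypothetical."""

    @abstractmethod
    def send(self, message: dict) -> None: ...

    @abstractmethod
    def receive(self) -> dict: ...

class InMemoryChannel(Channel):
    """A trivial backend; a JMS, AMQP or ESB binding would
    implement the same interface."""
    def __init__(self):
        self._queue = []

    def send(self, message: dict) -> None:
        self._queue.append(message)

    def receive(self) -> dict:
        return self._queue.pop(0)

# The process model talks only to Channel, never to the middleware,
# so the model stays conceptually complete and technology-agnostic.
ch: Channel = InMemoryChannel()
ch.send({"order_id": 42, "action": "approve"})
print(ch.receive())
```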
Abstract:
This paper deals with the development of 'art clusters' and their relocation in the city of Shanghai. It first looks at the revival of the city's old inner-city industrial area (along the banks of the Suzhou River) through 'organic' or 'alternative' artist-led cultural production; second, it describes the impact on these activities of the industrial restructuring of the wider city, reliant on large-scale real estate development, business services and global finance; and finally, it outlines the relocation of these arts (and related) cultural industries to dispersed CBD locations as a result of those spatial, industrial and policy changes.
Abstract:
Ultrafine particles (UFPs, <100 nm) are produced in large quantities by vehicular combustion and are implicated in causing several adverse human health effects. Recent work has suggested that a large proportion of daily UFP exposure may occur during commuting; however, the determinants, variability and transport-mode dependence of such exposure are not well understood. The aim of this review was to address these knowledge gaps by distilling the results of the 'in-transit' UFP exposure studies performed to date, including studies of health effects. We identified 47 exposure studies performed across 6 transport modes: automobile, bicycle, bus, ferry, rail and walking. These encompassed approximately 3000 individual trips in which UFP concentrations were measured. After weighting mean UFP concentrations by the number of trips in which they were collected, we found overall mean UFP concentrations of 3.4, 4.2, 4.5, 4.7, 4.9 and 5.7 × 10^4 particles cm^-3 for the bicycle, bus, automobile, rail, walking and ferry modes, respectively. The mean concentration inside automobiles travelling through tunnels was 3.0 × 10^5 particles cm^-3. While the mean concentrations were indicative of general trends, we found that the determinants of exposure (meteorology, traffic parameters, route, fuel type, exhaust treatment technologies, cabin ventilation, filtration, deposition, UFP penetration) exhibited marked variability and mode dependence, such that it is not necessarily appropriate to rank modes in order of exposure without detailed consideration of these factors. Ten in-transit health effects studies have been conducted, and their results indicate that UFP exposure during commuting can elicit acute effects in both healthy and health-compromised individuals. We suggest that future work should focus on further defining the contribution of in-transit UFP exposure to total UFP exposure, exploring its specific health effects, and investigating exposures in the developing world. Keywords: air pollution; transport modes; acute health effects; travel; public transport
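The trip-weighted averaging described above is straightforward to reproduce. The sketch below computes a trip-weighted mean concentration for one transport mode; the study values are invented for illustration and are not data from the review.

```python
# Trip-weighted mean UFP concentration for one transport mode.
# Each study contributes (mean concentration, number of trips);
# the values below are invented for illustration.
studies = [
    (3.1e4, 120),  # study A: mean particles/cm^3, trips
    (5.0e4, 40),   # study B
    (2.8e4, 200),  # study C
]

total_trips = sum(n for _, n in studies)
weighted_mean = sum(c * n for c, n in studies) / total_trips
print(f"{weighted_mean:.2e} particles/cm^3 over {total_trips} trips")
```

Weighting by trip count prevents a small study with an extreme mean from skewing the overall mode estimate.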
Abstract:
“Turtle Twilight” is a two-screen video installation. Paragraphs of text adapted from a travel blog type across the left-hand screen, while a computer-generated image of a tropical sunset is slowly animated on the right-hand screen. The two screens are accompanied by an atmospheric stock music track. The work examines how we construct, represent and deploy 'nature' in our contemporary lives, mixing cinematic codes with image, text and sound gleaned from online sources. Extending Nicolas Bourriaud's understanding of 'postproduction' and the creative and critical strategies of 'editing', it questions the relationship between contemporary screen culture, nature, desire and contemplation.
Abstract:
The design of artificial intelligence in computer games is an important component of a player's game-play experience. As games become more life-like and interactive, the need for more realistic game AI will increase, particularly for AI that simulates how human players act, behave and make decisions. The purpose of this research is to establish a model of player-like behavior that may be used to inform the design of artificial intelligence that more accurately mimics a player's decision-making process. The research uses a qualitative analysis of player opinions and reactions while playing a first-person shooter video game, together with recordings of their in-game actions, speech and facial characteristics. The initial studies provide player data that has been used to design a model of how a player behaves.
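As a rough illustration of the kind of model such data could inform, the sketch below implements a simple utility-based decision rule with added noise, so the agent errs in a human-like way rather than playing optimally. The actions, state features and utilities are invented for illustration, not the model derived in the study.

```python
import random

# A toy utility-based decision model of FPS player behavior.
# States are normalised to [0, 1]; weights and actions are invented.
ACTIONS = ["attack", "take_cover", "reload", "flee"]

def utility(action, state):
    if action == "attack":
        return state["ammo"] * state["health"]
    if action == "take_cover":
        return state["threat"] * (1 - state["health"])
    if action == "reload":
        return 1 - state["ammo"]
    if action == "flee":
        return state["threat"] * (1 - state["health"]) * (1 - state["ammo"])

def decide(state, noise=0.1):
    """Pick the highest-utility action, with noise so the agent
    behaves fallibly, like a human player, rather than optimally."""
    scored = [(utility(a, state) + random.uniform(0, noise), a) for a in ACTIONS]
    return max(scored)[1]

state = {"health": 0.35, "ammo": 0.2, "threat": 0.8}
print(decide(state))
```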
Abstract:
For people with intellectual disabilities there are significant barriers to inclusion in socially cooperative endeavours. This paper investigates the effectiveness of Stomp, a tangible user interface (TUI) designed to provide new participatory experiences for people with intellectual disabilities. Results from an observational study reveal the extent to which the Stomp system supports social and physical interaction. The tangible, spatial and embodied qualities of Stomp result in an experience that does not rely on the acquisition of specific competencies before interaction and engagement can occur.
Abstract:
Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their capability in displaying and manipulating information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains, amongst others. Currently, the burden of validating both the interactive functionality and the visual consistency of virtual environment content is carried entirely by developers and play-testers. While considerable research has been conducted into assisting the design of virtual world content and mechanics, to date only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed to enable reasoning about the color and geometry change of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and color spaces to a single, environment-independent, geometric transformation space; this mapping was used to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness, too, can be quantified for debugging purposes. Since computer games rely heavily on the use of highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques. Experiments were conducted on a game engine and other virtual world prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D games and virtual world studios that require a scalable solution to testing their virtual world software and digital content.
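As a greatly simplified illustration of appearance-based consistency checking, the sketch below scores a rendered frame against a trusted reference image and flags frames that deviate beyond a tolerance. This is a generic image-difference test under assumed inputs, not the model-based or connectionist techniques developed in the thesis.

```python
import numpy as np

def frame_error(rendered: np.ndarray, reference: np.ndarray) -> float:
    """Mean squared per-pixel error between a rendered frame and a
    trusted 'golden' reference, both as float arrays in [0, 1]."""
    return float(np.mean((rendered - reference) ** 2))

def is_consistent(rendered, reference, tol=1e-3):
    # A frame is flagged as a potential rendering bug if it deviates
    # from the reference by more than tol; tol is an illustrative value.
    return frame_error(rendered, reference) <= tol

# Synthetic 64x64 RGB frames standing in for engine output.
rng = np.random.default_rng(0)
reference = rng.random((64, 64, 3))
rendered = reference + rng.normal(0, 0.005, reference.shape)  # slight noise
print(frame_error(rendered, reference), is_consistent(rendered, reference))
```

A per-frame test like this only works when a golden reference exists; the environment-independent approaches described above are precisely what is needed when it does not.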
Abstract:
There is an intimate interconnectivity between policy guidelines defining reform and the delineation of the research methods subsequently applied to determine reform success. Research is guided as much by the metaphors describing it as by the ensuing empirical definition of the actions or results obtained from it. In a call for different reform policy metaphors, Lumby and English (2010) note: “The primary responsibility for the parlous state of education... lies with the policy makers that have racked our schools with reductive and dehumanizing processes, following the metaphors of market efficiency, and leadership models based on accounting and the characteristics of machine bureaucracy” (p. 127).
Abstract:
With estimates that two billion of the world's population will be 65 years or older by 2050, ensuring that older people 'age well' is an international priority. To date, however, there is significant disagreement and debate about how to define and measure 'ageing well', with no consensus on either terminology or measurement. This chapter therefore describes the research rationale, methodology and findings of the Australian Active Ageing Study (Triple A Study), which surveyed 2620 older Australians to identify significant contributors to quality of life for older people: work, learning, social participation, spirituality, emotional wellbeing, health, and life events. Exploratory factor analyses identified eight distinct elements (grouped into four key concepts) which appear to define 'active ageing' and together explained 55% of the variance: social and life participation (25%), emotional health (22%), physical health and functioning (4%) and security (4%). These findings highlight the importance of understanding and supporting the social and emotional dimensions of ageing, as issues of social relationships, life engagement and emotional health dominated the factor structure. Our intention is that this chapter will prompt informed debate and discussion on defining and measuring active ageing, facilitating exploration and understanding of the complexity of issues that intertwine, converge and enhance the ageing experience.
Abstract:
A broad range of positions is articulated in the academic literature around the relationship between recordings and live performance. Auslander (2008) argues that “live performance ceased long ago to be the primary experience of popular music, with the result that most live performances of popular music now seek to replicate the music on the recording”. Elliott (1995) suggests that “hit songs are often conceived and produced as unambiguous and meticulously recorded performances that their originators often duplicate exactly in live performances”. Wurtzler (1992) argues that “as socially and historically produced, the categories of the live and the recorded are defined in a mutually exclusive relationship, in that the notion of the live is premised on the absence of recording and the defining fact of the recorded is the absence of the live”. Yet many artists perform in ways that fundamentally challenge such positions. Whilst it is common practice for musicians across many musical genres to compose and construct their musical works in the studio, such that the recording is, in Auslander's words, the 'original performance', the live version is not simply an attempt to replicate the recorded version; indeed, in some cases such replication is impossible. There are well-known historical examples. Queen, for example, never performed the a cappella sections of Bohemian Rhapsody because they were too complex to perform live. A 1966 recording of the Beach Boys' studio creation Good Vibrations shows them struggling through the song prior to its release. This paper argues that as technology develops, the lines between the recording studio and live performance blur, and new models for performance emerge. In a 2010 live performance in New York, Grammy Award-winning artist Imogen Heap undertakes a live, improvised construction of a piece as a performative act. She invites the audience to choose the key for the track and proceeds to layer up the various parts in front of the audience as a live performance act. Her recording process is thus revealed on stage in real time: she performs a process that would once have been confined to the recording studio. So how do artists bring studio production processes into the live context? What aspects of studio production are now performable, and what consistent models can be identified amongst the various approaches now seen? This paper presents an overview of approaches to performative realisations of studio-produced tracks and illuminates some emerging relationships between recorded music and performance across a range of contexts.
Abstract:
In this paper, we examine the use of a Kalman filter to aid the mission-planning process for autonomous gliders. Given a set of waypoints defining the planned mission and a prediction of the ocean currents from a regional ocean model, we present an approach to determine the best constant time interval at which the glider should surface in order to maintain a prescribed tracking error while minimizing time spent on the ocean surface. We assume basic parameters for the execution of a given mission and provide the results of the Kalman filter mission-planning approach. These results are compared with previous executions of the given mission scenario.
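A one-dimensional sketch of the underlying idea: while the glider is submerged it dead-reckons, so the Kalman prediction step grows the position uncertainty with the (uncertain) current, and surfacing for a GPS fix resets it. The longest interval that keeps the predicted error within the bound is then a natural candidate for the constant surfacing interval. All parameters below are illustrative, not values from the paper.

```python
import math

# 1-D sketch: while submerged the glider dead-reckons, so position
# variance grows each minute with the uncertain current; surfacing
# gives a GPS fix that resets the variance. All numbers illustrative.
q = 25.0         # current-induced variance growth per minute (m^2/min)
p_fix = 4.0      # position variance right after a GPS fix (m^2)
max_err = 100.0  # prescribed tracking error bound (m, 1-sigma)

def predict_variance(p0, minutes):
    """Kalman prediction step for a random-walk model: P <- P + q*dt."""
    return p0 + q * minutes

def longest_interval(p0, bound):
    """Longest constant surfacing interval (minutes) that keeps the
    predicted 1-sigma position error below the bound."""
    t = 0
    while math.sqrt(predict_variance(p0, t + 1)) <= bound:
        t += 1
    return t

print(longest_interval(p_fix, max_err), "minutes between surfacings")
```

Maximising the interval subject to the error bound directly trades tracking accuracy against time spent on the surface, which is the trade-off the abstract describes.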