182 results for Event-log animation
Abstract:
The emergence of global computer networks and the ubiquitous availability of advanced information and communication technology (ICT) since the mid-1990s has given rise to the hope that the traditional disadvantages faced by regional economies and regional communities could be alleviated easily and swiftly. Yet the experience of both community informatics and community development researchers and practitioners tells a different tale. Although the potential of ICT is in fact realised in some situations and locations, and does provide means to ensure sustainability in some regional communities, elsewhere it has not been taken up or has not been able to bring about the promised change for the better. Too many communities still face a centralised structure in the context of commerce, service provision or governance, and various degrees of digital divide: between connected and disconnected, between media literate and illiterate, between young and old, and between urban and rural. Many attempts to close or bridge the digital divide have been reported with varying degrees of success (cf. Menou, 2001; Servon, 2002). Most of these accounts echo a common voice in that they report similar principles of action, and they reflect – in most cases unconsciously – practices of sociocultural animation. This article seeks to shed light on the concept of sociocultural animation, which is already commonplace in various forms in the arts, in education and professional development, youth work, sports, town planning, careers services, entrepreneurship and tourism. It starts by exploring the origins of sociocultural animation and draws parallels to the current state of research and practice. It unpacks the foundations of sociocultural animation and briefly describes the underlying principles and how they can be applied in the context of community informatics and developing regional communities with ICT. Finally, further areas of investigation are proposed.
Abstract:
Management of groundwater systems requires realistic conceptual hydrogeological models, not only as a framework for numerical simulation modelling but also for system understanding and for communicating that understanding to stakeholders and the broader community. To help meet these challenges we developed GVS (Groundwater Visualisation System), a stand-alone desktop software package that uses interactive 3D visualisation and animation techniques. The goal was a user-friendly groundwater management tool that could support a range of existing real-world and pre-processed data, both surface and subsurface, including geology and various types of temporal hydrological information. GVS allows these data to be integrated into a single conceptual hydrogeological model. In addition, 3D geological models produced externally using other software packages can readily be imported into GVS models, as can outputs of simulations (e.g. piezometric surfaces) produced by software such as MODFLOW or FEFLOW. Boreholes can be integrated, showing any down-hole data and properties, including screen information, intersected geology, water level data and water chemistry. Animation is used to display spatial and temporal changes, with time-series data such as rainfall, standing water levels and electrical conductivity displaying dynamic processes. Time and space variations can be presented using a range of contouring and colour mapping techniques, in addition to interactive plots of time-series parameters. Other types of data, for example demographics and cultural information, can also be readily incorporated. The GVS software can execute on a standard Windows or Linux-based PC with a minimum of 2 GB RAM, and the model output is easy and inexpensive to distribute, by download or via USB/DVD/CD.
Example models are described here for three groundwater systems in Queensland, northeastern Australia: two unconfined alluvial groundwater systems with intensive irrigation, the Lockyer Valley and the upper Condamine Valley, and the Surat Basin, a large sedimentary basin of confined artesian aquifers. This latter example required more detail in the hydrostratigraphy, correlation of formations with drillholes, and visualisation of simulated piezometric surfaces. Both alluvial system GVS models were developed during drought conditions to support government strategies to implement groundwater management. The Surat Basin model was industry-sponsored research for coal seam gas groundwater management and community information and consultation. The “virtual” groundwater systems in these 3D GVS models can be interactively interrogated through standard functions, plus production of 2D cross-sections, data selection from the 3D scene, a back-end database and plot displays. A unique feature is that GVS allows investigation of time-series data across different display modes, both 2D and 3D. GVS has been used successfully as a tool to enhance community/stakeholder understanding and knowledge of groundwater systems and is of value for training and educational purposes. Completed projects confirm that GVS provides powerful support for management and decision making, and serves as a tool for interpretation of groundwater system hydrological processes. A highly effective visualisation output is the production of short videos (e.g. 2–5 min) based on sequences of camera ‘fly-throughs’ and screen images. Further work involves developing support for multi-screen displays and touch-screen technologies, distributed rendering, and gestural interaction systems. To highlight the visualisation and animation capability of the GVS software, links to related multimedia hosted on online sites are included in the references.
Abstract:
This study uses borehole geophysical log data of sonic velocity and electrical resistivity to estimate permeability in sandstones in the northern Galilee Basin, Queensland. The prior estimates of permeability are calculated according to deterministic log–log linear empirical correlations between electrical resistivity and measured permeability. Both negative and positive relationships are influenced by the clay content. The prior estimates of permeability are updated in a Bayesian framework for three boreholes, using both the cokriging (CK) method and a normal linear regression (NLR) approach to infer the likelihood function. The results show that the mean permeability estimated from the CK-based Bayesian method is in better agreement with the measured permeability when a fairly apparent linear relationship exists between the logarithm of permeability and sonic velocity. In contrast, the NLR-based Bayesian approach gives better estimates of permeability for boreholes where no linear relationship exists between the logarithm of permeability and sonic velocity.
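The Bayesian updating step this abstract describes can be illustrated with a minimal conjugate-normal sketch: a prior on log-permeability (here standing in for the resistivity correlation) is combined with a likelihood (standing in for the regression on sonic velocity). All numbers and the simple normal-normal form are hypothetical; the paper's actual CK and NLR likelihoods are more elaborate.

```python
def bayes_update_logk(prior_mean, prior_var, obs_mean, obs_var):
    """Conjugate normal update of log10(permeability).

    prior_mean/prior_var: prior from a (hypothetical) log-log
        resistivity-permeability correlation.
    obs_mean/obs_var: likelihood from a (hypothetical) regression
        against sonic velocity.
    Returns the posterior mean and variance.
    """
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs_mean / obs_var)
    return post_mean, post_var

# Hypothetical values: prior says log10(k) ~ N(-13, 1),
# the velocity-based likelihood says N(-12, 0.5).
mean, var = bayes_update_logk(-13.0, 1.0, -12.0, 0.5)
print(round(mean, 3), round(var, 3))  # -12.333 0.333
```

The posterior mean is pulled toward whichever source has the smaller variance, which mirrors the abstract's finding that the better-fitting likelihood model dominates the estimate.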
Abstract:
Atmospheric ultrafine particles play an important role in affecting human health, altering climate and degrading visibility. Numerous studies have been conducted to better understand the formation process of these particles, including field measurements, laboratory chamber studies and mathematical modeling approaches. Field studies on new particle formation found that formation processes were significantly affected by atmospheric conditions, such as the availability of particle precursors and meteorological conditions. However, those studies were mainly carried out in rural areas of the Northern Hemisphere, and information on new particle formation in urban areas, especially those in subtropical regions, is limited. In general, subtropical regions display a higher level of solar radiation, along with stronger photochemical reactivity, than those regions investigated in previous studies. However, based on the results of these studies, the mechanisms involved in the new particle formation process remain unclear, particularly in the Southern Hemisphere. Therefore, in order to fill this gap in knowledge, a new particle formation study was conducted in a subtropical urban area in the Southern Hemisphere during 2009, which measured particle size distribution in different locations in Brisbane, Australia. Characterisation of nucleation events was conducted at the campus building of the Queensland University of Technology (QUT), located in an urban area of Brisbane. Overall, the annual average number concentrations of ultrafine, Aitken and nucleation mode particles were found to be 9.3 × 10³, 3.7 × 10³ and 5.6 × 10³ cm⁻³, respectively. This was comparable to levels measured in urban areas of northern Europe, but lower than those from polluted urban areas such as the Yangtze River Delta, China and Huelva and Santa Cruz de Tenerife, Spain.
Average particle number concentration (PNC) in the Brisbane region did not show significant seasonal variation; however, a relatively large variation was observed during the warmer season. Diurnal variation of Aitken and nucleation mode particles displayed different patterns, which suggested that direct vehicle exhaust emissions were a major contributor of Aitken mode particles, while nucleation mode particles originated from vehicle exhaust emissions in the morning and photochemical production at around noon. A total of 65 nucleation events were observed during 2009, of which 40 events were classified as nucleation growth events and the remainder were nucleation burst events. An interesting observation in this study was that all nucleation growth events were associated with vehicle exhaust emission plumes, while the nucleation burst events were associated with industrial emission plumes from an industrial area. The average particle growth rate for nucleation events was found to be 4.6 nm hr⁻¹ (ranging from 1.79–7.78 nm hr⁻¹), which is comparable to other urban studies conducted in the United States, while monthly particle growth rates were found to be positively related to monthly solar radiation (r = 0.76, p < 0.05). The particle growth rate values reported in this work are the first of their kind to be reported for a subtropical urban area of Australia. Furthermore, the influence of nucleation events on PNC within the urban airshed was also investigated. PNC was simultaneously measured at urban (QUT), roadside (Woolloongabba) and semi-urban (Rocklea) sites in Brisbane during 2009. Total PNC at these sites was found to be significantly affected by regional nucleation events. The relative fractions of PNC to total daily PNC observed at QUT, Woolloongabba and Rocklea were found to be 12%, 9% and 14%, respectively, during regional nucleation events.
These values were higher than those observed as a result of vehicle exhaust emissions during weekday mornings, which ranged from 5.1–5.5% at QUT and Woolloongabba. In addition, PNC in the semi-urban area of Rocklea increased by a factor of 15.4 when it was downwind of urban pollution sources under the influence of nucleation burst events. Finally, we investigated the influence of sulphuric acid on new particle formation in the study region. A H₂SO₄ proxy was calculated using [SO₂], solar radiation and particle condensation sink data to represent the new particle production strength for the urban, roadside and semi-urban areas of Brisbane during June–July 2009. The temporal variations of the H₂SO₄ proxies and the nucleation mode particle concentration were found to be in phase during nucleation events in the urban and roadside areas. In contrast, the peak of proxy concentration occurred 1–2 hr prior to the observed peak in nucleation mode particle concentration at the downwind semi-urban area of Brisbane. A moderate to strong linear relationship was found between the proxy and the freshly formed particles, with r² values of 0.26–0.77 during the nucleation events. In addition, the log[H₂SO₄ proxy] required to produce new particles was found to be ~1.0 ppb W m⁻² s and below 0.5 ppb W m⁻² s for the urban and semi-urban areas, respectively. The particle growth rates were similar during nucleation events at the three study locations, with an average value of 2.7 ± 0.5 nm hr⁻¹. This result suggested that a similar nucleation mechanism dominated in the study region, one strongly related to sulphuric acid concentration; however, the relationship between the proxy and PNC was poor in the semi-urban area of Rocklea. This can be explained by the fact that the nucleation process was initiated upwind of the site and the resultant particles were transported via the wind to Rocklea.
This explanation is also supported by the higher geometric mean diameter value observed for particles during the nucleation event and the time-lag relationship between the H₂SO₄ proxy and PNC observed at Rocklea. In summary, particle size distribution was continuously measured in a subtropical urban area of the Southern Hemisphere during 2009, the findings from which formed the first particle size distribution dataset in the study region. The characteristics of nucleation events in the Brisbane region were quantified, and the properties of the nucleation growth and burst events are discussed in detail using a case-study approach. To further investigate the influence of nucleation events on PNC in the study region, PNC was simultaneously measured at three locations to examine the spatial variation of PNC during the regional nucleation events. In addition, the impact of upwind urban pollution on the downwind semi-urban area was quantified during these nucleation events. Sulphuric acid was found to be an important factor influencing new particle formation in the urban and roadside areas of the study region; however, a direct relationship with nucleation events at the semi-urban site was not observed. This study provided an overview of new particle formation in the Brisbane region and its influence on PNC in the surrounding area. The findings of this work are the first of their kind for an urban area in the Southern Hemisphere.
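The H₂SO₄ proxy described in this abstract combines [SO₂], solar radiation and the particle condensation sink. A widely used proxy of this family has the form [SO₂] × radiation / CS; the sketch below uses that form with hypothetical input values and omits any proportionality constant, so it illustrates only the shape of the calculation, not the study's exact formulation.

```python
def h2so4_proxy(so2_ppb, radiation_wm2, condensation_sink_s):
    """Sulphuric-acid proxy of the form [SO2] * radiation / CS.

    so2_ppb: SO2 mixing ratio (ppb)
    radiation_wm2: solar radiation (W m^-2)
    condensation_sink_s: condensation sink (s^-1)
    Result is in ppb W m^-2 s, matching the units quoted
    in the abstract; the proportionality constant is omitted.
    """
    return so2_ppb * radiation_wm2 / condensation_sink_s

# Hypothetical midday values: 1.2 ppb SO2, 600 W m^-2, CS = 0.02 s^-1
print(h2so4_proxy(1.2, 600.0, 0.02))  # 36000.0
```

High radiation and low condensation sink both raise the proxy, which is consistent with the abstract's observation that nucleation tracked solar radiation.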
Abstract:
The rapid growth of visual information on the Web has led to immense interest in multimedia information retrieval (MIR). While advancement in MIR systems has achieved some success in specific domains, particularly through content-based approaches, general Web users still struggle to find the images they want. Despite the success in content-based object recognition or concept extraction, the major problem in current Web image searching remains in the querying process. Since most online users only express their needs in semantic terms or objects, systems that utilize visual features (e.g., color or texture) to search images create a semantic gap which hinders general users from fully expressing their needs. In addition, query-by-example (QBE) retrieval imposes extra obstacles for exploratory search because users may not always have a representative image at hand or in mind when starting a search (i.e. the page zero problem). As a result, the majority of current online image search engines (e.g., Google, Yahoo, and Flickr) still primarily use textual queries to search. The problem with query-based retrieval systems is that they only capture users’ information needs in terms of formal queries; the implicit and abstract parts of users’ information needs are inevitably overlooked. Hence, users often struggle to formulate queries that best represent their needs, and some compromises have to be made. Studies of Web search logs suggest that multimedia searches are more difficult than textual Web searches, and that Web image searching is the most difficult compared to video or audio searches. Hence, online users need to put in more effort when searching multimedia content, especially for image searches. Most interactions in Web image searching occur during query reformulation. While log analysis provides intriguing views on how the majority of users search, their search needs or motivations are ultimately neglected.
User studies on image searching have attempted to understand users’ search contexts in terms of users’ background (e.g., knowledge, profession, motivation for search and task types) and the search outcomes (e.g., use of retrieved images, search performance). However, these studies typically focused on particular domains with a selective group of professional users. General users’ Web image searching contexts and behaviors are little understood, although they represent the majority of online image searching activities nowadays. We argue that only by understanding Web image users’ contexts can current Web search engines further improve their usefulness and provide more efficient searches. In order to understand users’ search contexts, a user study was conducted based on university students’ Web image searching in News, Travel, and commercial Product domains. The three search domains were deliberately chosen to reflect image users’ interests in people, time, event, location, and objects. We investigated participants’ Web image searching behavior, with a focus on query reformulation and search strategies. Participants’ search contexts such as their search background, motivation for search, and search outcomes were gathered by questionnaires. The searching activity was recorded along with participants’ think-aloud data for analyzing significant search patterns. The relationships between participants’ search contexts and corresponding search strategies were discovered using a Grounded Theory approach. Our key findings include the following aspects:
- Effects of users' interactive intents on query reformulation patterns and search strategies
- Effects of task domain on task specificity and task difficulty, as well as on some specific searching behaviors
- Effects of searching experience on result expansion strategies
A contextual image searching model was constructed based on these findings.
The model helped us understand Web image searching from the user's perspective, and introduced a context-aware searching paradigm for current retrieval systems. A query recommendation tool was also developed to demonstrate how users’ query reformulation contexts can potentially contribute to more efficient searching.
Abstract:
Risk identification is one of the most challenging stages in the risk management process. Conventional risk management approaches provide little guidance, and companies often rely on the knowledge of experts for risk identification. In this paper we demonstrate how risk indicators can be used to predict process delays via a method for configuring so-called Process Risk Indicators (PRIs). The method learns suitable configurations from past process behaviour recorded in event logs. To validate the approach we have implemented it as a plug-in of the ProM process mining framework and have conducted experiments using various data sets from a major insurance company.
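The idea of configuring a delay indicator from past behaviour recorded in an event log can be sketched minimally as follows. The toy log, the duration-based indicator and the quantile threshold rule are all illustrative assumptions, not the paper's actual PRI method or the ProM plug-in's implementation.

```python
from datetime import datetime

# Toy event log: (case_id, activity, ISO timestamp)
log = [
    ("c1", "start", "2024-01-01T09:00"), ("c1", "end", "2024-01-01T10:00"),
    ("c2", "start", "2024-01-01T09:00"), ("c2", "end", "2024-01-01T15:00"),
    ("c3", "start", "2024-01-02T09:00"), ("c3", "end", "2024-01-02T09:30"),
]

def case_durations(log):
    """Duration in hours of each case, from its first to last event."""
    bounds = {}
    for case, _, ts in log:
        t = datetime.fromisoformat(ts)
        lo, hi = bounds.get(case, (t, t))
        bounds[case] = (min(lo, t), max(hi, t))
    return {c: (hi - lo).total_seconds() / 3600 for c, (lo, hi) in bounds.items()}

def configure_threshold(durations, quantile=0.8):
    """'Configure' the indicator from past behaviour: flag cases
    slower than a chosen quantile of historical durations."""
    vals = sorted(durations.values())
    return vals[int(quantile * (len(vals) - 1))]

durs = case_durations(log)
thr = configure_threshold(durs)
delayed = [c for c, d in durs.items() if d > thr]
print(thr, delayed)  # 1.0 ['c2']
```

The point is the division of labour: the threshold is learned from historical log data rather than fixed by an expert, which is the spirit of the configuration step the abstract describes.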
Abstract:
This paper offers an analysis of the character animation in Tangled to develop a deeper understanding of how Disney has approached the extension of their traditional aesthetic into the CG medium.
Abstract:
Part of the chapter: "Sale of Sperm, Health Records, Minimally Conscious States, and Duties of Candour" Although ethical obligations and good medical practice guidelines clearly contemplate open disclosure, there is a dearth of authority as to the nature and extent of a legal duty on Australian doctors to disclose adverse events to patients.
Abstract:
Time plays an important role in norms. In this paper we start from our previously proposed classification of obligations and point out some shortcomings of the Event Calculus (EC) in representing obligations. We propose an extension of EC that avoids these shortcomings and show how to use it to model the various types of obligations.
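The Event Calculus reasoning this abstract builds on can be sketched in a simplified discrete form: a fluent (here, an obligation) holds at a time point if some earlier event initiated it and no intervening event terminated it. This shows only plain EC with a hypothetical obligation fluent; the paper's extension for the different obligation types is not reproduced.

```python
def holds_at(fluent, t, happens, initiates, terminates):
    """Simplified discrete Event Calculus.

    happens: list of (time, event) occurrences.
    initiates/terminates: sets of (event, fluent) pairs.
    A fluent holds at t if it was initiated before t and not
    terminated by any later event that is still before t.
    """
    state = False
    for time, event in sorted(happens):
        if time >= t:
            break
        if (event, fluent) in initiates:
            state = True
        if (event, fluent) in terminates:
            state = False
    return state

# Hypothetical narrative: an invoice creates an obligation to pay,
# and the payment event discharges it.
happens = [(1, "invoice"), (5, "pay")]
initiates = {("invoice", "obliged_to_pay")}
terminates = {("pay", "obliged_to_pay")}

print(holds_at("obliged_to_pay", 3, happens, initiates, terminates))  # True
print(holds_at("obliged_to_pay", 6, happens, initiates, terminates))  # False
```

One shortcoming this simple formulation shares with plain EC is visible here: it records only whether the obligation holds, not whether it was fulfilled by a deadline or violated, which is the kind of distinction the paper's extension targets.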
Abstract:
In this practice-led research project I work to show how a re-reading, and a particular form of listening to, the sound-riddled nature of Gertrude Stein's work Two: Gertrude Stein and her Brother presents us with a contemporary theory of sound in language. This theory, though in its infancy, is a particular enjambment of sounded language that presents itself as an event, engaged with meaning, with its own inherent voice. It displays a propensity, through engagement with the 'other', to erupt into love. In this thesis these qualities are reverberated further through Seth Kim-Cohen's notion of the non-cochlear, Simon Jarvis's notion of musical thinking, Jean-Jacques Lecercle's notion of délire or nonsense, Luce Irigaray's notion of jouissant love and Bracha Ettinger's notion of the generative matrixial border space. This reading is simultaneously paired with my own work of scoring and creating a digital opera from Stein's work, thereby testing and performing Stein's theory. In this I show how a re-reading and re-listening to Stein's work can be significant to feminist ethical language frames, contemporary philosophy, sonic art theory and digital language frames. A further significance of this study is that when the reverberation of Stein's engagements with language through sound can be listened to, a pattern emerges, one that encouragingly problematizes subjectivity and interweaves genres/methods and means, creating a new frame for sound in language, one with its own voice, which I call soundage.
Abstract:
This paper elaborates the approach used by the Applied Data Mining Research Group (ADMRG) for the Social Event Detection (SED) Tasks of the 2013 MediaEval Benchmark. We extended a constrained clustering algorithm to apply to the first, semi-supervised clustering task, and we compared several classifiers with Latent Dirichlet Allocation as a feature selector in the second, event classification task. The proposed approach focuses on scalability and efficient memory allocation when applied to high-dimensional data with large clusters. Results of the first task show the effectiveness of the proposed method. Results from task 2 indicate that attention to imbalanced category distributions is needed.
Abstract:
In contemporary game development circles the ‘game making jam’ has become an important rite of passage and baptism event, an exploration space and a central indie lifestyle affirmation and community event. Game jams have recently become a focus for design researchers interested in the creative process. In this paper we tell the story of an established local game jam and our various documentation and data collection methods. We present the beginnings of the current project, which seeks to map the creative teams and their process in the space of the challenge, and which aims to enable participants to be more than the objects of the data collection. A perceived issue is that typical documentation approaches are ‘about’ the event as opposed to ‘made by’ the participants and are thus both at odds with the spirit of the jam as a phenomenon and do not really access the rich playful potential of participant experience. In the data collection and visualisation projects described here, we focus on using collected data to re-include the participants in telling stories about their experiences of the event as a place-based experience. Our goal is to find a means to encourage production of ‘anecdata’ - data based on individual story telling that is subjective, malleable, and resists collection via formal mechanisms - and to enable mimesis, or active narrating, on the part of the participants. We present a concept design for data as game based on the logic of early medieval maps and we reflect on how we could enable participation in the data collection itself.
Abstract:
The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the way in which the event itself, and the place of the event, has the potential to create its own stories. Game jams are the creative festivals of the game development community, and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism, with an on-site press room one year and an ‘embedded’ journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further story telling. [excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 01 2013, Melbourne, VIC, Australia