961 results for computational media aesthetics
Abstract:
Social media influence analysis, sometimes also called authority detection, aims to rank users by their influence scores in social media. Existing approaches to social influence analysis usually focus on developing effective algorithms to quantify users' influence scores; they rarely consider a person's expertise level, which is arguably important to influence measures. In this paper, we propose a computational approach to measuring the correlation between expertise and social media influence, and we take a new perspective on social media influence by incorporating expertise into influence analysis. We carefully constructed a large dataset of 13,684 Chinese celebrities from Sina Weibo (literally "Sina microblogging"). We found a strong correlation between expertise levels and social media influence scores, and our analysis gives a good explanation of the phenomenon of "top across-domain influencers". In addition, different expertise levels exhibited distinct influence patterns: (1) high-expertise celebrities have a stronger influence on the "audience" in their expertise domains; (2) expertise appears to be more important than relevance and participation for social media influence; and (3) the audiences of top-expertise celebrities are more likely to forward tweets from high-expertise celebrities on topics outside those celebrities' expertise domains.
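The abstract does not specify which correlation measure the authors use. Purely as an illustrative sketch, an expertise-influence correlation analysis of this kind might be set up as below; the column names and the synthetic placeholder data are hypothetical, standing in for the real Weibo dataset, and a rank correlation (Spearman) is chosen here because expertise levels are ordinal.

    # Hypothetical sketch of an expertise-vs-influence correlation analysis.
    # The paper's actual features, scoring method, and dataset schema are
    # not specified in the abstract; this is illustrative only.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)

    # Placeholder data standing in for the 13,684-celebrity Weibo dataset:
    # an ordinal expertise level (1-5) and a continuous influence score.
    expertise_level = rng.integers(1, 6, size=13684)
    influence_score = expertise_level + rng.normal(0.0, 1.5, size=13684)

    # Spearman's rho is a natural choice for ordinal expertise levels.
    rho, p_value = spearmanr(expertise_level, influence_score)
    print(f"Spearman rho = {rho:.3f}, p = {p_value:.3g}")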
Abstract:
An unstructured mesh finite volume discretisation method for simulating diffusion in anisotropic media in two-dimensional space is discussed. This technique is considered an extension of the fully implicit hybrid control-volume finite-element method, and it retains the local continuity of the flux at the control volume faces. A least squares function reconstruction technique, together with a new flux decomposition strategy, is used to obtain an accurate flux approximation at the control volume face, ensuring that the spatial discretisation maintains second-order accuracy overall. This paper highlights that the new technique coincides with the traditional shape function technique when the correction term is neglected, and that it significantly increases the accuracy of the previous linear scheme on coarse meshes when applied to media that exhibit very strong to extreme anisotropy ratios. It is concluded that the method can be used on both regular and irregular meshes, and appears independent of the mesh quality.
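For context, the quantity approximated at each control-volume face is the normal component of the anisotropic diffusive flux. In generic notation (these symbols are ours, not quoted from the paper), for a scalar field u and diffusivity tensor K the model problem and face flux are

\[
\frac{\partial u}{\partial t} = \nabla \cdot \left( \mathbf{K}\,\nabla u \right),
\qquad
F_{\sigma} = -\int_{\sigma} \left( \mathbf{K}\,\nabla u \right) \cdot \mathbf{n}_{\sigma}\,\mathrm{d}s,
\]

where \sigma is a control-volume face with unit normal \mathbf{n}_{\sigma}. The least-squares reconstruction and flux decomposition described above are aimed at approximating F_{\sigma} to second order even when \mathbf{K} is strongly anisotropic, in which case \mathbf{K}\,\nabla u is generally not aligned with \nabla u.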
Abstract:
This chapter traces the development of the global digital storytelling movement from its origins in California to its adoption by the BBC in the UK and its subsequent dispersal around the world. It identifies the foundational practices, uneven development and diffusion, and emergent practices internationally.
Abstract:
Network Jamming systems provide real-time collaborative media performance experiences for novice or inexperienced users. In this paper we outline the theoretical and developmental drivers for our Network Jamming software, jam2jam, which employs generative algorithmic techniques with particular implications for accessibility and learning. We describe how theories of engagement have directed the design and development of jam2jam, and show how iterative testing cycles at numerous international sites have informed the evolution of the system and its educational potential. Generative media systems present an opportunity for users to leverage computational systems to make sense of complex media forms through interactive and collaborative experiences. Generative music and art are relatively new phenomena that use procedural invention as a creative technique to produce music and visual media. Such systems present a range of affordances that can facilitate new kinds of relationships with music and media performance and production, and early systems have demonstrated the potential to provide access to collaborative ensemble experiences for users with little formal musical or artistic expertise. This presentation examines the educational affordances of these systems, as evidenced by field data drawn from the Network Jamming Project. These generative performance systems enable access to a unique kind of music/media ensemble performance with very little musical or media knowledge or skill, and they further offer the possibility of unique interactive relationships with artists and creative knowledge through collaborative performance. Through the process of observing, documenting and analysing young people interacting with the generative media software jam2jam, a theory of meaningful engagement has emerged from the need to describe and codify how users experience creative engagement with music/media performance and the locations of meaning. In this research we observed that the musical metaphors and practices of 'ensemble' or collaborative performance, and improvisation as a creative process for experienced musicians, can be made available to novice users. The relational meanings of these musical practices afford access to high-level personal, social and cultural experiences. Within the creative process of collaborative improvisation lies a series of modes of creative engagement that move from appreciation through exploration, selection and direction toward embodiment. The expressive sounds and visions made in real time by collaborating improvisers are immediate and compelling. Generative media systems let novices access these experiences through simple interfaces that allow them to produce highly professional and expressive sonic and visual content simply by using gestures and being attentive and perceptive to their collaborators. These kinds of experiences present the potential for highly complex expressive interactions with sound and media as a performance. Evidence emerging from this research suggests that collaborative performance with generative media is transformative and meaningful. In this presentation we draw out these ideas around an emerging theory of meaningful engagement that has evolved from the development of network jamming software. Primarily we focus on demonstrating how these experiences might lead to understandings of educational and social benefit.
Abstract:
Extensive groundwater withdrawal has resulted in a severe seawater intrusion problem in the Gooburrum aquifers at Bundaberg, Queensland, Australia. Better management strategies can be implemented by understanding the seawater intrusion processes in those aquifers. To study the seawater intrusion process in the region, a two-dimensional density-dependent, saturated and unsaturated flow and transport computational model is used. The model consists of a coupled system of two non-linear partial differential equations: the first describes the flow of a variable-density fluid, and the second describes the transport of dissolved salt. A two-dimensional control volume finite element model is developed for simulating the seawater intrusion into the heterogeneous aquifer system at Gooburrum. The simulation results provide a realistic means of studying the complex transport phenomena evolving in this heterogeneous coastal aquifer.
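For reference, coupled systems of this type usually take the following generic density-dependent flow/transport form; the notation here is standard rather than quoted from the paper:

\[
\frac{\partial (\phi \rho S)}{\partial t} + \nabla \cdot \left( \rho\,\mathbf{q} \right) = 0,
\qquad
\mathbf{q} = -\frac{k_{r}\,\mathbf{k}}{\mu} \left( \nabla p + \rho g \nabla z \right),
\]
\[
\frac{\partial (\phi S c)}{\partial t} + \nabla \cdot \left( \mathbf{q}\,c - \phi S \mathbf{D}\,\nabla c \right) = 0,
\qquad
\rho = \rho(c),
\]

where \phi is porosity, S saturation, \rho fluid density, \mathbf{q} the Darcy flux, \mathbf{k} permeability, k_{r} relative permeability, \mu viscosity, p pressure, c salt concentration and \mathbf{D} the dispersion tensor. The constitutive relation \rho(c) is what couples the flow equation to the salt transport equation.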
Abstract:
The Guardian's reportage of the 2009 United Kingdom Member of Parliament (MP) expenses scandal used crowdsourcing and computational journalism techniques. Computational journalism can be broadly defined as the application of computer science techniques to the activities of journalism. Its foundation lies in computer-assisted reporting techniques, and its importance is increasing due to: (a) the increasing availability of large-scale government datasets for scrutiny; (b) the declining cost, increasing power and ease of use of data mining and filtering software, together with Web 2.0; and (c) the explosion of online public engagement and opinion. This paper provides a case study of the Guardian's MP expenses scandal reportage and reveals some key challenges and opportunities for digital journalism. First, it finds that journalists may increasingly take an active role in understanding, interpreting, verifying and reporting clues or conclusions that arise from the interrogation of datasets (computational journalism). Secondly, a distinction should be made between information reportage and computational journalism in the digital realm, just as a distinction might be made between citizen reporting and citizen journalism. Thirdly, an opportunity exists for online news providers to take a 'curatorial' role, selecting and making easily available the best data sources for readers to use (information reportage). These activities have always been fundamental to journalism; however, the way in which they are undertaken may change. The findings suggest opportunities and challenges for the implementation of computational journalism techniques in practice by digital Australian media providers, as well as further areas of research.
Abstract:
There are at least four key challenges in the online news environment that computational journalism may address. First, news providers operate in a rapidly evolving environment, and larger businesses are typically slower to adapt to market innovations. Second, news consumption patterns have changed, and news providers need to find new ways to capture and retain digital users. Third, declining financial performance has led to cost cuts at mass-market newspapers. Finally, investigative reporting is typically slow, high-cost and often tedious, yet is valuable to the reputation of a news provider. Computational journalism involves the application of software and technologies to the activities of journalism, and it draws on the fields of computer science, social science and communications. New technologies may enhance the traditional aims of journalism, or may require "a new breed of people who are midway between technologists and journalists" (Irfan Essa in Mecklin 2009: 3). Historically referred to as 'computer-assisted reporting', the use of software in online reportage is increasingly valuable due to three factors: larger datasets are becoming publicly available; software is becoming more sophisticated and ubiquitous; and the Australian digital economy is developing. This paper introduces the key elements of computational journalism: why it is needed, what it involves, and its benefits and challenges, together with a case study and examples. When correctly used, computational techniques can quickly provide a solid factual basis for original investigative journalism and may increase interaction with readers. This is a major opportunity to enhance the delivery of original investigative journalism, which ultimately may attract and retain readers online.
Abstract:
In this work two different finite volume computational strategies for solving a representative two-dimensional diffusion equation in an orthotropic medium are considered. When the diffusivity tensor is treated as linear, this problem admits an analytic solution used for analysing the accuracy of the proposed numerical methods. In the first method, the gradient approximation techniques discussed by Jayantha and Turner [Numerical Heat Transfer, Part B: Fundamentals, 40, pp.367–390, 2001] are applied directly to the
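For reference, the representative problem described is the two-dimensional diffusion equation in an orthotropic medium, which in generic notation (ours, not quoted from the truncated abstract) reads

\[
\frac{\partial u}{\partial t}
= \frac{\partial}{\partial x}\!\left( k_{x} \frac{\partial u}{\partial x} \right)
+ \frac{\partial}{\partial y}\!\left( k_{y} \frac{\partial u}{\partial y} \right),
\]

i.e. the diffusivity tensor is diagonal in the coordinate frame, \mathbf{K} = \mathrm{diag}(k_{x}, k_{y}). When \mathbf{K} is independent of u the problem is linear, which is the case admitting the analytic solution used for the accuracy study.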
Abstract:
We present a mass-conservative vertex-centred finite volume method for efficiently solving the mixed form of Richards' equation in heterogeneous porous media. The spatial discretisation is particularly well suited to heterogeneous media because it produces consistent flux approximations at quadrature points where material properties are continuous. Combined with the method of lines, the spatial discretisation gives a set of differential-algebraic equations amenable to solution using higher-order implicit solvers. We investigate the solution of the mixed form using a Jacobian-free inexact Newton solver, which requires solving for one extra variable per mesh node compared with the pressure-head form. By exploiting the structure of the Jacobian for the mixed form, the size of the preconditioner is reduced to that of the pressure-head form, and there is minimal computational overhead in solving the mixed form. The proposed formulation is tested on two challenging test problems. The solutions from the new formulation conserve mass at least one order of magnitude more accurately than a pressure-head formulation, and the higher-order temporal integration significantly improves both the mass balance and the computational efficiency of the solution.
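For reference, the mixed form of Richards' equation is the standard formulation that retains both the volumetric moisture content \theta and the pressure head \psi as unknowns (generic notation, not necessarily the paper's):

\[
\frac{\partial \theta(\psi)}{\partial t} = \nabla \cdot \left[ K(\psi)\,\nabla (\psi + z) \right],
\]

where K(\psi) is the hydraulic conductivity and z the elevation head. Carrying \theta alongside \psi is what gives the extra variable per node mentioned above, and it is also what keeps the discrete mass balance tight, in contrast to the pressure-head form C(\psi)\,\partial\psi/\partial t = \nabla \cdot [K(\psi)\nabla(\psi+z)] with C(\psi) = \mathrm{d}\theta/\mathrm{d}\psi, whose chain-rule linearisation is a known source of mass-balance error.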
Abstract:
An improved mesoscopic model is presented for simulating the drying of porous media. The aim of the model is to account for two scales simultaneously: the scale of the whole product and the scale of the heterogeneities of the porous medium. The innovation of the method is a new mass-conservative scheme, based on the Control-Volume Finite-Element (CV-FE) method, that partitions the moisture content field over the individual sub-control volumes surrounding each node of the mesh. Although the new formulation has potential for application across a wide range of transport processes in heterogeneous porous media, the focus here is on applying the model to the drying of small sections of softwood consisting of several growth rings. The results show that, compared with a previously published scheme, only the new mass-conservative formulation correctly captures the true moisture content evolution in the earlywood and latewood components of the growth rings during drying.
Abstract:
Discrete stochastic simulations, via techniques such as the Stochastic Simulation Algorithm (SSA), are a powerful tool for understanding the dynamics of chemical kinetics when certain molecular species are present in low numbers. However, an important constraint is the assumption of well-mixedness and homogeneity. In this paper, we show how to use Monte Carlo simulations to estimate an anomalous diffusion parameter that encapsulates the crowdedness of the spatial environment. We then use this parameter to replace the rate constants of bimolecular reactions with a time-dependent power law, producing an SSA valid in cases where anomalous diffusion occurs or the system is not well mixed (ASSA). Simulations then show that ASSA can successfully predict the temporal dynamics of chemical kinetics in a spatially constrained environment.
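The abstract does not give the power law itself. A minimal Gillespie-style sketch, assuming a fractal-kinetics rate of the form k(t) = k0 * t**(-h) for a single bimolecular channel A + B -> C, with the exponent h standing in for the estimated anomalous diffusion parameter, might look like:

    # Hypothetical sketch of an SSA step with a time-dependent power-law
    # rate for a bimolecular reaction, in the spirit of the ASSA described
    # above. The exact power law and parameters used in the paper are not
    # stated here, so k(t) = k0 * t**(-h) is an assumption.
    import random

    def assa_bimolecular(a, b, k0=1.0, h=0.3, t_end=10.0, t0=1e-6):
        """Simulate A + B -> C with time-dependent rate k(t) = k0 * t**(-h)."""
        t, c = t0, 0                       # small t0 avoids t**(-h) at t = 0
        history = [(t, a, b, c)]
        while t < t_end and a > 0 and b > 0:
            # Propensity with the rate coefficient frozen at the current
            # time (a quasi-static approximation).
            prop = k0 * t ** (-h) * a * b
            t += random.expovariate(prop)  # exponential waiting time
            a, b, c = a - 1, b - 1, c + 1  # fire the single reaction channel
            history.append((t, a, b, c))
        return history

    if __name__ == "__main__":
        random.seed(1)
        print(assa_bimolecular(50, 50)[-1])

Note that freezing the rate at the current time when sampling the waiting time is a quasi-static approximation; an exact treatment of a time-dependent propensity would integrate it over the waiting interval.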
Abstract:
Computational journalism involves the application of software and technologies to the activities of journalism, and it draws on the fields of computer science, the social sciences, and media and communications. New technologies may enhance the traditional aims of journalism, or may initiate greater interaction between journalists and information and communication technology (ICT) specialists. The enhanced use of computing in news production is related in particular to three factors: larger government datasets becoming more widely available; the increasingly sophisticated and ubiquitous nature of software; and the developing digital economy. Drawing on international examples, this paper argues that computational journalism techniques may provide new foundations for original investigative journalism and increase the scope for new forms of interaction with readers. Computational journalism provides a major opportunity to enhance the delivery of original investigative journalism, and to attract and retain readers online.
Abstract:
The global release of 250,000 US Embassy diplomatic cables to selected media sites worldwide through the WikiLeaks website was arguably the major global media event of 2010. Beyond the implications of the cables' content for international politics and diplomacy, the actions of WikiLeaks and its controversial editor-in-chief, the Australian Julian Assange, bring together a range of arguments about how the media, news and journalism are being transformed in the 21st century. This paper focuses on the reactions of Australian online news media sites to the release of the diplomatic cables by WikiLeaks, including the online sites of established news outlets such as The Australian, the Sydney Morning Herald and The Age, the ABC's The Drum site, and online-only sites such as Crikey, New Matilda and On Line Opinion. The study focuses on opinion and commentary rather than straight news reportage, and the analysis is framed around three issues: WikiLeaks and international diplomacy; the implications of WikiLeaks for journalism; and WikiLeaks and democracy, including debates about the organisation and the ethics of its own practice. It also considers whether a "WikiLeaks Effect" has wider implications for how journalism is conducted in the future, particularly the method of 'redaction' of large amounts of computational data.