954 results for Inherent Audiences

Relevance:

10.00%

Publisher:

Abstract:

The ability to identify and assess user engagement with transmedia productions is vital to the success of individual projects and the sustainability of this mode of media production as a whole. It is essential that industry players have access to tools and methodologies that offer the most complete and accurate picture of how audiences/users engage with their productions and which assets generate the most valuable returns on investment. Drawing upon research conducted with Hoodlum Entertainment, a Brisbane-based transmedia producer, this chapter outlines an initial assessment of the way engagement tends to be understood, why standard web analytics tools are ill-suited to measuring it, how a customised tool could offer solutions, and why this question of measuring engagement is so vital to the future of transmedia as a sustainable industry.


It is well established that there are inherent difficulties involved in communicating across cultural boundaries. When these difficulties are encountered within the justice system, the innocent can be convicted and witnesses undermined. A large amount of research has been undertaken on the implications of miscommunication within the courtroom, but far less has been carried out on language and interactions between police and Indigenous Australians. It is necessary that officers of the law be made aware of linguistic issues to ensure they conduct their investigations in a fair, effective and therefore ethical manner. This paper draws on Cultural Schema Theory to illustrate how this could be achieved. The justice system is reliant upon the skills and knowledge of the police; this paper therefore highlights the need for research to focus on the linguistic and non-verbal differences between Australian Aboriginal English and Australian Standard English in order to develop techniques that facilitate effective communication.


In this paper, the inherent mechanism of the social benefits associated with smart grid development is examined based on the pressure-state-response (PSR) model from resource economics. The emerging technologies brought about by smart grid development are regarded as pressures. The improvements in the performance and efficiency of power system operation, such as the enhanced capability of accommodating renewable energy generation, are regarded as states. The effects of smart grid development on society are regarded as responses. A novel method for evaluating the social benefits of smart grid development is then presented. Finally, the social benefits of smart grid development in a province in northwest China are evaluated using the developed system, and reasonable evaluation results are attained.
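The PSR-style aggregation described above can be sketched as a weighted scoring scheme: raw indicators are normalised onto a common scale and the pressure, state and response categories are combined with weights. The indicator names, benchmark ranges and weights below are illustrative assumptions, not values from the study.

```python
# Sketch of a PSR-style evaluation: normalise raw indicators onto [0, 1]
# and aggregate the pressure, state and response categories with weights.
# Indicator values, benchmarks and weights are illustrative assumptions.

def normalise(value, worst, best):
    """Map a raw indicator onto [0, 1] relative to worst/best benchmarks."""
    return max(0.0, min(1.0, (value - worst) / (best - worst)))

def psr_score(pressures, states, responses, weights=(0.3, 0.4, 0.3)):
    """Weighted aggregate of three categories of normalised scores."""
    cats = [sum(c) / len(c) for c in (pressures, states, responses)]
    return sum(w * c for w, c in zip(weights, cats))

# Hypothetical indicators: smart-meter rollout ratio (pressure),
# renewable accommodation rate (state), consumer benefit index (response).
p = [normalise(0.60, 0.0, 1.0)]
s = [normalise(0.25, 0.05, 0.40)]
r = [normalise(0.70, 0.0, 1.0)]
print(round(psr_score(p, s, r), 3))
```

A real evaluation system of this kind would carry many indicators per category and derive weights systematically (e.g. from expert judgement or entropy methods) rather than fixing them by hand.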


During invasion and metastasis, cancer cells interact closely with the extracellular matrix molecules by attachment, degradation, and migration. We demonstrated previously the local degradation of fluorescently labeled gelatin matrix by cancer cells at invasive membrane protrusions, called invadopodia. Using the newly developed quantitative fluorescence-activated cell sorting-phagocytosis assay and image analysis of localized degradation of fluorescently labeled matrix, we document here that degradation and site-specific removal of cross-linked gelatin matrix is correlated with the extent of phagocytosis in human breast cancer cells. A higher phagocytic capacity is generally associated with increasing invasiveness, documented in other invasion and motility assays as well. Gelatin phagocytosis is time- and cell-density-dependent, and it is mediated by the actin cytoskeleton. Most of the intracellular gelatin is routed to actively acidified vesicles, as demonstrated by the fluorescent colocalization of gelatin with acidic vesicles, indicating the intracellular degradation of the phagocytosed matrix in lysosomes. We show here that normal intracellular routing is blocked after treatment with acidification inhibitors. In addition, the need for partial proteolytic degradation of the matrix prior to phagocytosis is demonstrated by the inhibition of gelatin phagocytosis with different serine and metalloproteinase inhibitors and its stimulation by conditioned medium containing the matrix metalloproteinases MMP-2 and MMP-9. Our results demonstrate that phagocytosis of extracellular matrix is an inherent feature of breast tumor cells that correlates with and may even directly contribute to their invasive capacity. This assay is useful for screening and evaluating potential anti-invasive agents because it is fast, reproducible, and versatile.


This paper presents the modeling and position-sensorless vector control of a dual-airgap axial flux permanent magnet (AFPM) machine optimized for use in flywheel energy storage system (FESS) applications. The proposed AFPM machine has two sets of three-phase stator windings but requires only a single power converter to control both the electromagnetic torque and the axial levitation force. The proper controllability of the latter is crucial as it can be utilized to minimize the vertical bearing stress to improve the efficiency of the FESS. The method for controlling both the speed and axial displacement of the machine is discussed. An inherently sensorless observer is also proposed for speed estimation. The proposed observer eliminates the rotary encoder, which in turn reduces the overall weight and cost of the system while improving its reliability. The effectiveness of the proposed control scheme has been verified by simulations and experiments on a prototype machine.


Digital forensics concerns the analysis of electronic artifacts to reconstruct events such as cyber crimes. This research produced a framework to support forensic analyses by identifying associations in digital evidence using metadata. It showed that metadata-based associations can help uncover the inherent relationships between heterogeneous digital artifacts, thereby aiding the reconstruction of past events by identifying artifact dependencies and time sequencing. It also showed that metadata-association-based analysis is amenable to automation by virtue of the ubiquitous nature of metadata across forensic disk images, files, system and application logs, and network packet captures. The results show that metadata-based associations can be used to extract meaningful relationships between digital artifacts, thus potentially benefiting real-life forensic investigations.
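One way to picture the kind of metadata-based association the framework identifies is a minimal sketch that links artifacts when they share a metadata value or fall within a time window. The 'owner' field, the 60-second window and the sample records below are hypothetical, not the framework's actual schema.

```python
# Sketch of metadata-based association: artifacts are linked when they
# share a metadata value or their timestamps fall within a time window.
# The 'owner' field, the 60 s window and the records are hypothetical.
from collections import defaultdict
from itertools import combinations

artifacts = [
    {"name": "report.docx", "owner": "alice", "mtime": 1000.0},
    {"name": "draft.pdf",   "owner": "alice", "mtime": 1030.0},
    {"name": "pkt.pcap",    "owner": "bob",   "mtime": 5000.0},
]

def associations(items, window=60.0):
    links = set()
    by_owner = defaultdict(list)
    for a in items:                       # link artifacts sharing an owner
        by_owner[a["owner"]].append(a["name"])
    for group in by_owner.values():
        links.update(combinations(sorted(group), 2))
    for a, b in combinations(items, 2):   # link time-proximate artifacts
        if abs(a["mtime"] - b["mtime"]) <= window:
            links.add(tuple(sorted((a["name"], b["name"]))))
    return links

print(associations(artifacts))
```

Here the two alice-owned files are linked twice over (shared owner and a 30 s gap) while the packet capture stays unassociated; a real framework would draw on far richer metadata fields spanning filesystem, application log and network sources.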


Differential settlement at the bridge approach, between the deck and the rail track on ground, is often a source of challenging technical and economic problems. It is caused by the sudden stiffness change between the bridge deck and the track on ground, and by changes in the stiffness of the backfill and sub-grade soils with moisture content and loading history. To minimise the negative social and economic impacts of poorly performing railway tracks at bridge transition zones, special attention must be given at the design, construction and maintenance stages. Obtaining an appropriate design solution for a given site condition is critically challenging, and most existing conventional design approaches cannot capture the actual on-site behaviour because of their inherent assumptions of continuity and their failure to account for local effects. An evaluation of existing design techniques is undertaken to assess their contributions to a potential solution for bridge transition zones. This paper analyses five different approaches: the Chinese Standard, the European Standard with three different approaches, and the Australian approach. Each design approach is used to calculate the layer thicknesses, accounting for critical design features such as train speed, axle load, backfill and subgrade condition, and dynamic loading response. Considering the correlation between track degradation and design parameters, this paper concludes that an optimised design approach for bridge transition zones is still needed.


A commitment in 2010 by the Australian Federal Government to spend $466.7 million on the implementation of personally controlled electronic health records (PCEHR) heralded a shift to a more effective and safer patient-centric eHealth system. However, deployment of the PCEHR has met with much criticism, emphasised by poor adoption rates over the first 12 months of operation. An indifferent response by the public and healthcare providers, largely sceptical of its utility and safety, speaks to the complex sociotechnical drivers and obstacles inherent in embedding large (national) scale eHealth projects. With government efforts to inflate consumer and practitioner engagement numbers giving rise to further consumer disillusionment, the broader utilitarian opportunities available with the PCEHR are at risk. This paper discusses the implications of establishing the PCEHR as the cornerstone of a holistic eHealth strategy for the aggregation of longitudinal patient information. A viewpoint is offered that the real value in patient data lies not just in the collection of data but in the integration of this information into clinical processes within the framework of a commoditised data-driven approach. Consideration is given to the eHealth-as-a-Service (eHaaS) construct as a disruptive next step for coordinated, individualised healthcare in the Australian context.


Protocols for bioassessment often relate changes in summary metrics that describe aspects of biotic assemblage structure and function to environmental stress. Biotic assessment using multimetric indices now forms the basis for setting regulatory standards for stream quality and a range of other goals related to water resource management in the USA and elsewhere. Biotic metrics are typically interpreted with reference to the expected natural state to evaluate whether a site is degraded. It is critical that natural variation in biotic metrics along environmental gradients is adequately accounted for, in order to quantify human disturbance-induced change. A common approach used in the Index of Biotic Integrity (IBI) is to examine scatter plots of variation in a given metric along a single stream-size surrogate and fit a line (drawn by eye) to form the upper bound, and hence define the maximum likely value of a given metric in a site of a given environmental characteristic (termed the 'maximum species richness line', MSRL). In this paper we examine whether the use of a single environmental descriptor and the MSRL is appropriate for defining the reference condition for a biotic metric (fish species richness) and for detecting human disturbance gradients in rivers of south-eastern Queensland, Australia. We compare the accuracy and precision of the MSRL approach based on single environmental predictors with three regression-based prediction methods (Simple Linear Regression, Generalised Linear Modelling and Regression Tree modelling) that use (either singly or in combination) a set of landscape- and local-scale environmental variables as predictors of species richness. We compared the frequency of classification errors from each method against set biocriteria and contrast the ability of each method to accurately reflect human disturbance gradients at a large set of test sites.
The results of this study suggest that the MSRL based upon variation in a single environmental descriptor could not accurately predict species richness at minimally disturbed sites when compared with SLRs based on equivalent environmental variables. Regression-based modelling incorporating multiple environmental variables as predictors explained natural variation in species richness more accurately than did simple models using single environmental predictors. Prediction error arising from the MSRL was substantially higher than for the regression methods and led to an increased frequency of Type I errors (incorrectly classing a site as disturbed). We suggest that the problems with the MSRL arise from the inherent scoring procedure used and from its limitation to predicting variation in the dependent variable along a single environmental gradient.
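The contrast between an eye-fitted upper bound and a fitted regression can be sketched on synthetic data. The environmental descriptor, sample sizes and noise level below are illustrative assumptions, and a 95th-percentile-per-bin bound stands in for the eye-drawn MSRL.

```python
# Synthetic contrast between an MSRL-style upper bound and a regression
# fit. A single environmental descriptor (log10 catchment area) drives
# "natural" species richness; all numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
log_area = rng.uniform(0, 4, 200)                      # environmental gradient
richness = 2 + 5 * log_area + rng.normal(0, 2, 200)    # natural variation

# MSRL analogue: upper bound (95th percentile) of richness within bins
# of the gradient, mimicking a line drawn over the top of the scatter.
edges = np.linspace(0, 4, 9)
bins = np.digitize(log_area, edges)
msrl = {b: np.percentile(richness[bins == b], 95) for b in np.unique(bins)}

# Regression analogue: least-squares fit of expected richness.
slope, intercept = np.polyfit(log_area, richness, 1)

site = 2.0                                   # a hypothetical undisturbed site
expected = intercept + slope * site          # regression prediction
upper = msrl[np.digitize([site], edges)[0]]  # MSRL-style benchmark

# Scoring a site against the upper bound rather than the expected value
# inflates the apparent shortfall at an undisturbed site, the mechanism
# behind the Type I errors discussed above.
print(round(upper - expected, 2))
```

Because the MSRL benchmark sits above the expected value by construction, an undisturbed site with typical richness appears depauperate against it, which is one way the elevated Type I error rate arises.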


Fractional differential equations have been increasingly used as a powerful tool to model the non-locality and spatial heterogeneity inherent in many real-world problems. However, a constant challenge faced by researchers in this area is the high computational expense of obtaining numerical solutions of these fractional models, owing to the non-local nature of fractional derivatives. In this paper, we introduce a finite volume scheme with a preconditioned Lanczos method as an attractive and high-efficiency approach for solving two-dimensional space-fractional reaction-diffusion equations. The computational heart of this approach is the efficient computation of a matrix-function-vector product f(A)b, where A is the matrix representation of the Laplacian obtained from the finite volume method and is non-symmetric. A key aspect of our proposed approach is that the popular Lanczos method for symmetric matrices is applied to this non-symmetric problem after a suitable transformation. Furthermore, the convergence of the Lanczos method is greatly improved by incorporating a preconditioner. Our approach is showcased by solving the fractional Fisher equation, including a validation of the solution and an analysis of the behaviour of the model.
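The core kernel, approximating f(A)b from a Krylov subspace, can be sketched for a symmetric A. The symmetrising transformation and the preconditioner from the paper are omitted here, and f = exp, the matrix size and the subspace dimension are purely illustrative choices.

```python
# Minimal Lanczos sketch for approximating f(A)b with symmetric A:
# build an orthonormal Krylov basis V and tridiagonal matrix T, then use
# f(A)b ~ ||b|| * V * f(T) * e1. Full reorthogonalisation keeps the
# basis numerically orthogonal for this small illustrative problem.
import numpy as np

def lanczos_fAb(A, b, f, m):
    n = len(b)
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(max(m - 1, 0))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        # Full reorthogonalisation against all previous basis vectors
        # (subsumes the usual three-term recurrence subtraction).
        w = w - V[:, :j + 1] @ (V[:, :j + 1].T @ w)
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[:, j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    evals, evecs = np.linalg.eigh(T)          # T is small: use its eigensystem
    fT_e1 = evecs @ (f(evals) * evecs[0, :])  # f(T) applied to e1
    return np.linalg.norm(b) * (V @ fT_e1)

# Small symmetric test problem (illustrative only).
rng = np.random.default_rng(1)
M = rng.normal(size=(12, 12))
A = (M + M.T) / 2
b = rng.normal(size=12)
approx = lanczos_fAb(A, b, np.exp, m=12)
```

With m equal to the matrix dimension the Krylov space is full and the result matches a direct eigendecomposition; in practice m is kept far smaller than n, which is where the efficiency, and the benefit of preconditioning the iteration, comes from.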


The use of Wireless Sensor Networks (WSNs) for vibration-based Structural Health Monitoring (SHM) has become a promising approach due to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data asynchronicity and data loss have prevented these distinct systems from being extensively used. Recently, several SHM-oriented WSNs have been proposed and believed to be able to overcome a large number of technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs with respect to demanding SHM applications such as modal analysis and damage identification. Based on a brief review, this paper first reveals that Data Synchronization Error (DSE) is the most inherent factor amongst the uncertainties of SHM-oriented WSNs. The effects of this factor are then investigated on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques when merging data from multiple sensor setups. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and data-driven Stochastic Subspace Identification (SSI-data), as both have been widely applied in the past decade. Accelerations collected by a wired sensory system on a large-scale laboratory bridge model are used as benchmark data after a certain level of noise is added, to account for the higher presence of this factor in SHM-oriented WSNs. From this source, a large number of simulations have been made to generate multiple DSE-corrupted datasets to facilitate statistical analyses. The results of this study show the robustness of FDD and the precautions needed for the SSI-data family when dealing with DSE at a relaxed level. Finally, the combination of preferred OMA techniques and the use of channel projection for the time-domain OMA technique to cope with DSE are recommended.
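How a synchronisation error manifests can be sketched in a few lines: a delay between two channels observing the same mode appears as a spurious phase lag in their cross-spectrum, which is what corrupts mode shapes merged across setups. The sampling rate, mode frequency and 10 ms delay below are illustrative assumptions, not the study's benchmark values.

```python
# A 10 ms synchronisation error between two channels observing the same
# 5 Hz mode shows up as a spurious phase lag of 2*pi*f0*dse in their
# cross-spectrum. Sampling rate, frequency and delay are assumptions.
import numpy as np

fs = 100.0                         # sampling rate (Hz)
f0 = 5.0                           # mode frequency (Hz)
dse = 0.01                         # data synchronisation error (s)
t = np.arange(0, 20, 1 / fs)       # 20 s record: whole number of periods

ch1 = np.sin(2 * np.pi * f0 * t)           # reference channel
ch2 = np.sin(2 * np.pi * f0 * (t - dse))   # same response, delayed by DSE

X1, X2 = np.fft.rfft(ch1), np.fft.rfft(ch2)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
k = int(np.argmin(np.abs(freqs - f0)))     # FFT bin nearest the mode

phase_lag = np.angle(X1[k] * np.conj(X2[k]))   # spurious phase (rad)
print(round(phase_lag, 4), round(2 * np.pi * f0 * dse, 4))
```

The two printed values agree: the apparent phase lag grows linearly with both frequency and delay, so higher modes are distorted more by the same DSE, consistent with the precautions noted for merging multi-setup data.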


The limited terms in which international production is currently discussed in Australia do not allow for serious consideration of the multiple and complex ways such production enables new connections with filmmakers and audiences around the world. The narrowness of the debate also prevents us from considering fully what that production entails for Australian cinema, what it means, who it speaks to, and how it could spark new conversations about the possibilities of filmmaking and storytelling in this country.


Notwithstanding the problems with identifying audiences (cf. Hartley, 1987) or with sampling them (cf. Turner, 2005), we contend that by using social media it is at least possible to gain an understanding of the habits of those who choose to engage with content through social media. In this chapter, we broadly outline the ways in which networks such as Twitter and Facebook can stand as proxies for audiences in a number of scenarios, and enable content creators, networks and researchers to understand the ways in which audiences come into existence, change over time, and engage with content. Beginning with the classic audience, television, we consider the evolution of metrics from baseline volume metrics to the more sophisticated 'telemetrics' that are the focus of our current work. We discuss the evolution of these metrics from principles developed in the field of 'sabermetrics', and highlight their effectiveness as both a predictor and a baseline for producers and networks to measure the success of their social media campaigns. Moving beyond the evaluation of audience engagement, we then consider the 'audiences' themselves. Building on Hartley's argument that audiences are "imagined" constructs (1987, p. 125), we demonstrate the continual shift of Australian television audiences, from episode to episode and series to series, showing through our map of the Australian Twittersphere (Bruns, Burgess & Highfield, 2014) both the variation amongst those who directly engage with television content and those who are exposed to it through their social media networks. By exploring overlaps between sporting events (such as the NRL and AFL Grand Finals), reality TV (such as Big Brother, My Kitchen Rules & Biggest Loser), soaps (e.g. Bold & The Beautiful, Home & Away), and current affairs programming (e.g. Morning Television & A Current Affair), we discuss to what extent it is possible to profile and categorise Australian television audiences. Finally, we move beyond television audiences to consider audiences around social media platforms themselves. Building on our map of the Australian Twittersphere (Bruns, Burgess & Highfield, 2014) and a pool of 5,000 active Australian accounts, we discuss the interconnectedness of audiences around particular subjects and how specific topics spread throughout the Twitter user base. Using Twitter as a proxy, we also consider the careers of a number of popular YouTubers, utilising a method we refer to as Twitter Accession charts (Bruns & Woodford, 2014) to identify growth curves and relate them to specific events in a YouTuber's career, be they 'viral' videos or collaborations, to discuss how audiences form around specific content creators.


"This work forms part of a much larger collaborative album project in progress between Tim Bruniges, Julian Knowles and David Trumpmanis which explores the intersections between traditional rock instrumentation and analogue and digital media. All of the creative team are performers, composers and producers. The material for the album was thus generated by a series of in-studio improvisations and performances, with each collaborator assuming a range of different and alternating roles: guitars, electronics, drums, percussion, bass, keyboards and production. Thematically the work explores the intersection of instrumental (post) rock, ambient music, and historical electro-acoustic tape composition traditions. Over the past 10 years, musical practice has become increasingly hybrid, with the traditional boundaries between genres becoming progressively eroded. At the same time, digital tools have replaced many of the major analogue technologies that dominated music production and performance in the 20th century. The disappearance of analogue media in mainstream musical practice has had a profound effect on the sonic characteristics of contemporary music and the gestural basis for its production. Despite the increasing power of digital technologies, a small but dedicated group of practitioners has continued to prize and use analogue technology for its unique sounds and the non-linearity of the media, aestheticising its inherent limitations and flaws. At the most radical end of this spectrum lie glitch and lo-fi musical forms, seen in part as reactions to the clinical nature of digital media and the perceived lack of character associated with its transparency. Such developments have also problematised the traditional relationships between media and genre, where specific techniques and their associated sounds have become genre markers.
Tristate is an investigation into this emerging set of dialogues between analogue and digital media across composition, production and performance. It employs analogue tape loops in performance, where a tape machine 'performer' records and hand-manipulates loops of an electric guitar performer on 'destroyed' tape stock (intentionally damaged tape), processing the output of this analogue system in the digital domain with contemporary sound processors. In doing so it investigates how the most extreme sonic signatures of analogue media, tape dropout and noise, can be employed alongside contemporary digital sound gestures in both compositional and performance contexts, and how the extremes of the two media signatures can be brought together both compositionally and performatively. In respect of genre, the work established strategies for merging compositional techniques from the early musique concrète tradition of the 1940s with late-60s popular music experimentalism and the laptop glitch electronica movement of the early 2000s. Lastly, the work explores how analogue recording studio technologies can be used as performance tools, thus illuminating and foregrounding the performative/gestural dimensions of traditional analogue studio tools in use."


Vertical graphene nanosheets (VGNS) hold great promise for high-performance supercapacitors owing to their excellent electrical transport properties, large surface area and, in particular, an inherent three-dimensional, open network structure. However, it remains challenging to materialise VGNS-based supercapacitors because of poor specific capacitance, high-temperature processing, poor binding to electrode support materials, uncontrollable microstructure, and the lack of cost-effective fabrication methods. Here we use a single-step, fast, scalable, and environmentally benign plasma-enabled method to fabricate VGNS using butter, a cheap and spreadable natural fatty precursor, and demonstrate control over the degree of graphitization and the density of VGNS edge planes. Our VGNS, employed as binder-free supercapacitor electrodes, exhibit a high specific capacitance of up to 230 F g−1 at a scan rate of 10 mV s−1 and >99% capacitance retention after 1,500 charge-discharge cycles at a high current density, when the optimum combination of graphitic structure and edge-plane effects is utilised. The energy storage performance can be further enhanced by forming stable hybrid MnO2/VGNS nano-architectures which synergistically combine the advantages of both VGNS and MnO2. This deterministic and uniquely plasma-based way of fabricating VGNS may open a new avenue for producing functional nanomaterials for advanced energy storage devices.
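For context on the headline figure, gravimetric specific capacitance is commonly extracted from a cyclic voltammetry (CV) curve as C_sp = Q / (2 v ΔV m). The sketch below applies that arithmetic to an idealised rectangular CV trace; the mass, potential window and current values are illustrative assumptions, not the paper's measured data.

```python
# Specific capacitance from a CV sweep: C_sp = Q / (2 * v * dV * m),
# where Q is the |i|-V area over a full cycle, v the scan rate, dV the
# potential window and m the active mass. All values here are synthetic.
import numpy as np

v = 0.010                      # scan rate in V/s (10 mV/s)
m = 1.0e-3                     # assumed active mass in grams
V_lo, V_hi = 0.0, 0.8          # assumed potential window (V)
dV = V_hi - V_lo

# Idealised rectangular CV of an ideal capacitor: |i| = C * v.
C_cell = 0.230                 # assumed cell capacitance in farads
potentials = np.linspace(V_lo, V_hi, 500)
current = np.full_like(potentials, C_cell * v)   # sweep current (A)

# Trapezoidal integration of |i| dV over one sweep, doubled for the cycle.
seg = (np.abs(current[:-1]) + np.abs(current[1:])) / 2 * np.diff(potentials)
Q = 2 * float(np.sum(seg))           # area under |i|-V, full cycle (A*V)
C_specific = Q / (2 * v * dV * m)    # gravimetric capacitance (F/g)
print(round(C_specific, 1))          # prints 230.0
```

A real CV trace deviates from the rectangle (resistive tilt, redox features, and MnO2 pseudocapacitance in the hybrid electrodes), which is why the integral form is used rather than reading a single current value.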