995 results for Jason Kramer
Abstract:
One of the perceived Achilles heels of online citizen journalism is its inability to conduct investigative and first-hand reporting. A number of projects have recently addressed this problem, with varying success: the U.S.-based Assignment Zero was described as "a highly satisfying failure" (Howe 2007), while the German MyHeimat.de appears to have been thoroughly successful in attracting a strong community of contributors, even to the point of being able to generate print versions of its content, distributed free of charge to households in selected German cities. In Australia, citizen journalism played a prominent part in covering the federal elections held on 24 November 2007; news bloggers and public opinion Websites provided a strong counterpoint to the mainstream media coverage of the election campaign (Bruns et al., 2007). Youdecide2007.org (YD07), a collaboration between researchers at Queensland University of Technology and media practitioners at the public service broadcaster SBS, the public opinion site On Line Opinion, and technology company Cisco Systems, was developed as a dedicated space for specifically hyperlocal coverage of the election campaign in each of Australia's 150 electorates, from the urban sprawls of Sydney and Brisbane to the sparsely populated remote regions of outback Australia. YD07 provided training materials for would-be citizen journalists and encouraged them to contribute electorate profiles, interview candidates, and conduct vox-pops with citizens in their local area. The site developed a strong following, especially in its home state of Queensland, and its interviewers influenced national public debate by uncovering the sometimes controversial personal views of mainstream and fringe candidates. At the same time, the success of YD07 was limited by external constraints determined by campaign timing and institutional frameworks.
As part of a continuing action research cycle, lessons learnt from Youdecide2007.org will be translated into further iterations of the project, which will cover the local government elections in the Australian state of Queensland, to be held in March 2008, and developments subsequent to these elections. This paper will present research outcomes from the Youdecide2007.org project. In particular, it will examine the roles of staff contributors and citizen journalists in attracting members, providing information, promoting discussion, and fostering community on the site: early results from a study of interaction data on the site indicate notably different contribution patterns and effects for staff and citizen participants, which may point towards the possibility of developing more explicit pro-am collaboration models in line with the Pro-Am phenomenon outlined by Leadbeater & Miller (2004). The paper will outline strengths and weaknesses of the Youdecide model and highlight requirements for the successful development of active citizen journalism communities. In doing so, it will also evaluate the feasibility of hyperlocal citizen journalism approaches, and their interrelationship with broader regional, state, and national journalism in both its citizen and industrial forms.
Abstract:
This paper proposes a novel relative entropy rate (RER) based approach for multiple HMM (MHMM) approximation of a class of discrete-time uncertain processes. Under different uncertainty assumptions, the model design problem is posed either as a min-max optimisation problem or a stochastic minimisation problem on the RER between joint laws describing the state and output processes (rather than the more usual RER between output processes). A suitable filter is proposed, for which performance results are established that bound conditional mean estimation performance and show that estimation performance improves as the RER is reduced. These filter consistency and convergence bounds are the first results characterising multiple HMM approximation performance, and suggest that joint RER concepts provide a useful model selection criterion. The proposed model design process and MHMM filter are demonstrated on an important image processing dim-target detection problem.
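A minimal numerical sketch of the RER-based model selection idea (for simplicity this uses the more usual output-process RER rather than the paper's joint-law formulation, and the two-state models, parameters, and thresholds below are invented for illustration; the MHMM filter itself is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_hmm(A, B, pi, T):
    """Simulate a discrete-output HMM: A = transitions, B = emissions, pi = initial."""
    states, obs = np.zeros(T, int), np.zeros(T, int)
    states[0] = rng.choice(len(pi), p=pi)
    obs[0] = rng.choice(B.shape[1], p=B[states[0]])
    for t in range(1, T):
        states[t] = rng.choice(len(pi), p=A[states[t - 1]])
        obs[t] = rng.choice(B.shape[1], p=B[states[t]])
    return obs

def log_likelihood(obs, A, B, pi):
    """Scaled forward algorithm; returns log p(obs | model)."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum()); alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        ll += np.log(alpha.sum()); alpha /= alpha.sum()
    return ll

# A "true" model and two candidate approximations (all invented).
A_true = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.8, 0.2], [0.3, 0.7]])
A_close = np.array([[0.85, 0.15], [0.25, 0.75]])  # mild perturbation
A_far = np.array([[0.5, 0.5], [0.5, 0.5]])        # ignores state persistence
pi = np.array([0.5, 0.5])

# Monte Carlo estimate of the per-step RER (average log-likelihood-ratio rate).
T, runs = 500, 20
rer = {"close": 0.0, "far": 0.0}
for _ in range(runs):
    obs = simulate_hmm(A_true, B, pi, T)
    ll_true = log_likelihood(obs, A_true, B, pi)
    rer["close"] += (ll_true - log_likelihood(obs, A_close, B, pi)) / (T * runs)
    rer["far"] += (ll_true - log_likelihood(obs, A_far, B, pi)) / (T * runs)

print(rer)  # the candidate closer to the true dynamics has the smaller RER
```

Selecting the candidate with the smallest estimated RER mirrors the paper's claim that estimation performance improves as the RER is reduced.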
Abstract:
This work aims to take advantage of recent developments in joint factor analysis (JFA) in the context of a phonetically conditioned GMM speaker verification system. Previous work has shown performance advantages through phonetic conditioning, but this has not been shown to date with the JFA framework. Our focus is particularly on strategies for combining the phone-conditioned systems. We show that the classic fusion of the scores is suboptimal when using multiple GMM systems. We investigate several combination strategies in the model space, and demonstrate improvement over score-level combination as well as over a non-phonetic baseline system. This work was conducted during the 2008 CLSP Workshop at Johns Hopkins University.
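A sketch of the classic score-level fusion baseline that the paper argues is suboptimal for multiple GMM systems (the per-system scores, fusion weights, and decision threshold below are invented; the actual phone-conditioned JFA systems are far richer than this):

```python
import numpy as np

# Rows are verification trials; columns are scores from three hypothetical
# phone-conditioned subsystems (values invented for illustration).
scores = np.array([
    [1.2, 0.8, 1.5],    # trial 1
    [-0.5, -1.1, 0.2],  # trial 2
])

weights = np.array([0.4, 0.3, 0.3])  # assumed fusion weights (sum to 1)
threshold = 0.0                      # assumed decision threshold

fused = scores @ weights             # linear score-level fusion
decisions = fused > threshold        # accept / reject per trial
print(fused, decisions)
```

Model-space combination, by contrast, merges the subsystems before scoring; the sketch above only shows the score-level baseline being compared against.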
Abstract:
The requirement for improved efficiency whilst maintaining system security necessitates the development of improved system analysis approaches and the development of advanced emergency control technologies. Load shedding is a type of emergency control that is designed to ensure system stability by curtailing system load to match generation supply. This paper presents a new adaptive load shedding scheme that provides emergency protection against excess frequency decline, whilst minimizing the risk of line overloading. The proposed load shedding scheme uses the local frequency rate information to adapt the load shedding behaviour to suit the size and location of the experienced disturbance. The proposed scheme is tested in simulation on a 3-region, 10-generator sample system and shows good performance.
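A hedged sketch of the core mechanism: size the load to shed from the locally measured rate of change of frequency (ROCOF) via the swing equation. The inertia constant, system base, feeder data, and greedy feeder selection below are all invented for illustration; the paper's scheme additionally adapts to disturbance location to limit line-overloading risk, which this omits.

```python
F_NOM = 50.0     # Hz, nominal system frequency (assumed)
H = 4.0          # s, aggregate inertia constant (assumed)
S_BASE = 1000.0  # MVA, aggregate generation base (assumed)

def imbalance_from_rocof(rocof_hz_s):
    """Swing equation: df/dt = -dP * f_nom / (2 H S)  =>  dP = -2 H S * rocof / f_nom."""
    return -2.0 * H * S_BASE * rocof_hz_s / F_NOM

def shed_schedule(rocof_hz_s, feeders):
    """Greedily shed feeders (largest first) until the estimated deficit is covered."""
    deficit = imbalance_from_rocof(rocof_hz_s)
    shed, total = [], 0.0
    for name, mw in sorted(feeders.items(), key=lambda kv: -kv[1]):
        if total >= deficit:
            break
        shed.append(name)
        total += mw
    return deficit, shed

# A locally measured 0.5 Hz/s frequency decline implies an 80 MW deficit here.
feeders = {"industrial": 120.0, "commercial": 80.0, "residential": 60.0}
deficit, plan = shed_schedule(-0.5, feeders)
print(deficit, plan)
```

Because the estimate comes from local frequency-rate measurements, the amount shed scales with the size of the disturbance rather than following a fixed under-frequency staircase.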
Abstract:
How and why visualisations support learning was the subject of this qualitative instrumental collective case study. Five computer programming languages (PHP, Visual Basic, Alice, GameMaker, and RoboLab) supporting differing degrees of visualisation were used as cases to explore the effectiveness of software visualisation in developing fundamental computer programming concepts (sequence, iteration, selection, and modularity). Cognitive theories of visual and auditory processing, cognitive load, and mental models provided a framework in which student cognitive development was tracked and measured among thirty-one 15- to 17-year-old students, drawn from a Queensland metropolitan secondary private girls’ school, who took part as active participants in the research. Seventeen findings in three sections increase our understanding of the effects of visualisation on the learning process. The study extended the use of mental model theory to track the learning process, and demonstrated the application of student research-based metacognitive analysis of individual and peer cognitive development as a means to support research and as an approach to teaching. The findings also put forward an explanation for failures in previous software visualisation studies; in particular, the study demonstrated that for the cases examined, where complex concepts are being developed, mixing auditory (or text) and visual elements can result in excessive cognitive load and impede learning. This finding provides a framework for selecting the most appropriate instructional programming language based on the cognitive complexity of the concepts under study.
Abstract:
This study aimed to develop and assess the reliability and validity of a pair of self-report questionnaires to measure self-efficacy and expectancy associated with benzodiazepine use: the Benzodiazepine Refusal Self-Efficacy Questionnaire (BRSEQ) and the Benzodiazepine Expectancy Questionnaire (BEQ). The internal structure of the questionnaires was established by principal component analysis (PCA) in a sample of 155 respondents, and verified by confirmatory factor analyses (CFA) in a second independent sample (n=139) using structural equation modeling. The PCA of the BRSEQ resulted in a 16-item, 4-factor scale, and the BEQ formed an 18-item, 2-factor scale. Both scales were internally reliable. CFA confirmed these internal structures and reduced the questionnaires to a 14-item self-efficacy scale and a 12-item expectancy scale. Lower self-efficacy and higher expectancy were moderately associated with higher scores on the SDS-B. The scales provide reliable measures for assessing benzodiazepine self-efficacy and expectancies. Future research will examine the utility of the scales in prospective prediction of benzodiazepine cessation.
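An illustrative sketch of the PCA-based item-reduction step on synthetic data (the item pool, loadings, sample size, and 0.4 loading cut-off here are invented and much smaller than the actual BRSEQ/BEQ pools; the CFA verification stage is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic responses: two latent factors driving six items, plus one weak item
# (a stand-in for questionnaire items that PCA should flag for removal).
n = 200
f1, f2 = rng.normal(size=(2, n))
items = np.column_stack([
    f1 + 0.3 * rng.normal(size=n),  # items 0-2 load on factor 1
    f1 + 0.3 * rng.normal(size=n),
    f1 + 0.3 * rng.normal(size=n),
    f2 + 0.3 * rng.normal(size=n),  # items 3-4 load on factor 2
    f2 + 0.3 * rng.normal(size=n),
    rng.normal(size=n),             # item 5: pure noise, loads on neither
])

# Principal component analysis on the correlation matrix.
R = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)       # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Retain the first two components (a hypothesised two-factor structure, as for
# the BEQ), then drop items whose maximum absolute loading falls below 0.4.
k = 2
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])
keep = np.where(np.abs(loadings).max(axis=1) >= 0.4)[0].tolist()
print(keep)  # the noise item (index 5) is dropped
```

In the study proper, this exploratory reduction was then cross-validated by CFA in an independent sample, which trimmed the scales further.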
Abstract:
This paper investigates self–Googling through the monitoring of users' search engine activities, and adds to the few quantitative studies on this topic already in existence. We explore this phenomenon by answering the following questions: To what extent is self–Googling visible in search engine usage? Is any significant difference measurable between self–Googling queries and generic search queries? To what extent do self–Googling search requests match the selected personalised Web pages? To address these questions we draw on the theory of narcissism to help define self–Googling, and present the results from a 14–month online experiment using Google search engine usage data.
Abstract:
This paper examines a sequence of asynchronous interaction on the photo-sharing website Flickr. In responding to a call for a focus on the performative aspects of online annotation (Wolff & Neuwirth, 2001), we outline and apply an interaction order approach to identify temporal and cultural aspects of the setting that provide for commonality and sharing. In particular, we study the interaction as a feature of a synthetic situation (Knorr Cetina, 2009), focusing on the requirements of maintaining a sense of an ongoing discussion online. Our analysis suggests that the rhetorical system of the Flickr environment, its appropriation by participants as a context for bounded activities, and displays of commonality, affiliation, and shared access provide for a common sense of participation in a time envelope. This, in turn, is argued to be central to new processes of consociation (Schutz, 1967; Zhao, 2004) occurring in the life world of Web 2.0 environments.
Abstract:
This work presents an extended Joint Factor Analysis model including explicit modelling of unwanted within-session variability. The goals of the proposed extended JFA model are to improve verification performance with short utterances by compensating for the effects of limited or imbalanced phonetic coverage, and to produce a flexible JFA model that is effective over a wide range of utterance lengths without adjusting model parameters such as retraining session subspaces. Experimental results on the 2006 NIST SRE corpus demonstrate the flexibility of the proposed model by providing competitive results over a wide range of utterance lengths without retraining and also yielding modest improvements in a number of conditions over current state-of-the-art.
Abstract:
Privacy enhancing protocols (PEPs) are a family of protocols that allow secure exchange and management of sensitive user information. They are important in preserving users’ privacy in today’s open environment. Proof of the correctness of PEPs is necessary before they can be deployed. However, the traditional provable security approach, though well established for verifying cryptographic primitives, is not applicable to PEPs. We apply the formal method of Coloured Petri Nets (CPNs) to construct an executable specification of a representative PEP, namely the Private Information Escrow Bound to Multiple Conditions Protocol (PIEMCP). Formal semantics of the CPN specification allow us to reason about various security properties of PIEMCP using state space analysis techniques. This investigation provides us with preliminary insights for modeling and verification of PEPs in general, demonstrating the benefit of applying the CPN-based formal approach to proving the correctness of PEPs.
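A toy illustration of the state-space analysis idea on a miniature, invented escrow protocol (not the PIEMCP itself, and plain Python rather than Coloured Petri Nets): enumerate every reachable state of an explicit transition system and check a safety property over all of them, as state-space techniques do over a CPN's reachability graph.

```python
from collections import deque

# Invented miniature escrow protocol. A state is a triple:
#   (escrowed, conditions_met, revealed)
# The safety property to verify: identity is never revealed unless the
# identity was escrowed AND every release condition has been met.
CONDITIONS = frozenset({"court_order", "user_consent"})

def successors(state):
    escrowed, met, revealed = state
    nxt = []
    if not escrowed:
        nxt.append((True, met, revealed))          # user escrows identity
    for c in CONDITIONS - met:
        nxt.append((escrowed, met | {c}, revealed))  # a condition becomes met
    if escrowed and met == CONDITIONS:
        nxt.append((escrowed, met, True))          # reveal only when all hold
    return nxt

# Breadth-first exploration of the full reachable state space.
initial = (False, frozenset(), False)
seen, queue = {initial}, deque([initial])
while queue:
    for t in successors(queue.popleft()):
        if t not in seen:
            seen.add(t)
            queue.append(t)

# Check the safety property over every reachable state.
violations = [s for s in seen if s[2] and s[1] != CONDITIONS]
print(len(seen), violations)  # no violating states are reachable
```

Real CPN tools perform the same kind of exhaustive check, but with coloured tokens, guards, and far larger state spaces, plus reduction techniques to keep exploration tractable.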
Abstract:
This paper argues that media and communications theory, as with cultural and creative industries analysis, can benefit from a deeper understanding of economic growth theory. Economic growth theory is elucidated in the context of both cultural and media studies and with respect to modern Chinese economic development. Economic growth is a complex evolutionary process that is tightly integrated with socio-cultural and political processes. This paper seeks to explore this mechanism and to advance cultural theory from an erstwhile political economy perspective to one centred on the co-evolutionary dynamics of economic and socio-political systems. A generic model is presented in which economic and social systems co-evolve through the origination, adoption and retention of new ideas, and in which the creative industries are a key part of this process. The paper concludes that digital media capabilities are a primary source of economic development.
Abstract:
This paper proposes that the 'creative industries' (CIs) play an important yet widely unexamined function in economic evolution through their role in the innovation process. This occurs in terms of the facilitation of demand for novelty, the provision and development of social technologies for producer-consumer interactions, and the adoption and embedding of new technologies as institutions. The incorporation of CIs into the Schumpeterian model of economic evolution thus fills a notable gap in the social technologies of the origination, adoption and retention of innovation.
Abstract:
The benefits of openness are widely apparent everywhere except, seemingly, in occupations. Yet the case against occupational licensing still remains strong. Consideration of dynamic costs strengthens the case further.
Abstract:
It has long been recognised that government and public sector services suffer an innovation deficit compared to private or market-based services. This paper argues that this can be explained as an unintended consequence of the concerted public sector drive toward the elimination of waste through efficiency, accountability and transparency. Yet in an evolving economy this can be a false efficiency, as it also eliminates the 'good waste' that is a necessary cost of experimentation. This results in a systematic trade-off in the public sector between the static efficiency of minimizing the misuse of public resources and the dynamic efficiency of experimentation. This trade-off is inherently biased against risk and uncertainty, and thereby explains why governments find service innovation so difficult. In the drive to eliminate static inefficiencies, many political systems have subsequently overshot and stifled policy innovation. I propose the 'Red Queen' solution of adaptive economic policy.
Abstract:
We propose that a general analytic framework for cultural science can be constructed as a generalization of the generic micro–meso–macro framework proposed by Dopfer and Potts (2008). This paper outlines this argument along with some implications for the creative industries research agenda.