450 results for methodologies
Abstract:
As media institutions are encouraged to explore new production methodologies in the current economic crisis, they align with Schumpeter’s creative destruction provocation by exhibiting user-led political, organisational and socio-technical innovations. This paper highlights the significance of the cultural intermediary within the innovative, co-creative production arrangements for cultural artefacts by media professionals in institutional online communities. An institutional online community is defined as one that is housed, resourced and governed by commercial or non-commercial institutions and is not independently facilitated. Web 2.0 technologies have mobilised collaborative peer production activities for online content creation, and professional media institutions face challenges in engaging participatory audiences in practices that are beneficial for all concerned stakeholders. The interests of those stakeholders often do not align, highlighting the need for an intermediary role that understands and translates the norms, rhetorical tropes and day-to-day activities between the individuals engaging in participatory communication activities, for successful negotiation within the production process. This paper specifically explores the participatory relationship between the public service broadcaster (PSB), the Australian Broadcasting Corporation (ABC), and one of its online communities, ABC Pool (www.abc.net.au/pool). ABC Pool is an online platform developed and resourced by the ABC to encourage co-creation between audience members engaging in the production of user-generated content (UGC) and the professional producers housed within the ABC Radio Division. This empirical research emerges from a three-year research project in which I employed an ethnographic action research methodology and was embedded at the ABC as the community manager of ABC Pool.
In participatory communication environments, users favour meritocratic heterarchical governance over traditional institutional hierarchical systems (Malaby 2009). A reputation environment based on meritocracy requires an intermediary to identify the stakeholders, understand their interests and communicate effectively between them to negotiate successful production outcomes (Bruns 2008; Banks 2009). The community manager generally occupies this role; however, it has emerged that other institutional production environments also employ an intermediary role under alternative monikers (Hutchinson 2012). A useful umbrella term to encompass the myriad of roles within this space is the cultural intermediary. The ABC has experimented with three institutional online community governance models that engage in cultural intermediation in differing decentralised capacities. The first and most closed is a single-point-of-contact model, in which one cultural intermediary controls all of the communication of the participatory project. The second is a model of multiple cultural intermediaries engaging in communication between the institutional online community stakeholders simultaneously. The third is the most open yet problematic, as it promotes and empowers community participants to the level of cultural intermediaries. This paper uses the ABC Pool case study to highlight the differing levels of openness within cultural intermediation during the co-creative production process of a cultural artefact.
Abstract:
Exosomes have been shown to act as mediators of cell-to-cell communication and as a potential source of biomarkers for many diseases, including prostate cancer. Exosomes are nanosized vesicles secreted by cells and consist of proteins normally found in multivesicular bodies, RNA, DNA and lipids. As a potential source of biomarkers, exosomes have attracted considerable attention, as their protein content resembles that of their cells of origin, even though it is noted that the proteins, miRNAs and lipids found in exosomes are not a reflective stoichiometric sampling of the contents of the parent cells. While the biogenesis of exosomes in dendritic cells and platelets has been extensively characterized, much less is known about the biogenesis of exosomes in cancer cells. An understanding of the processes involved in prostate cancer will help to further elucidate the role of exosomes and other extracellular vesicles in prostate cancer progression and metastasis. Few methodologies are available for the general isolation of exosomes; however, validation of those methodologies is necessary to study the role of exosome-derived biomarkers in various diseases. In this review, we discuss “exosomes” as a member of the family of extracellular vesicles and their potential to provide candidate biomarkers for prostate cancer.
Abstract:
In response to a growing interest in art and science interactions and transdisciplinary research strategies, this research project examines the critical and conceptual affordances of ArtScience practice and outlines a new experiential methodology for practice-led research using a framework of creative becoming. In doing so, the study contributes to the field of ArtScience and transdisciplinary practice by providing new strategies for creative development and critical enquiry across art and science.
Abstract:
Whilst there is an excellent and growing body of literature around female criminality underpinned by feminist methodologies, the nitty-gritty of the methodological journey is nowhere as well detailed as it is in the context of the Higher Degree Research (HDR) thesis. Thus, the purpose of this paper is threefold: i) to explore a range of feminist methodologies underpinning 20 Australian HDR theses focussing on female criminality; ii) to identify and map the governance/ethics tensions experienced by these researchers whilst undertaking high-risk research in the area of female offending; and iii) to document strategies drawn from negotiations, resolutions and outcomes to a range of gate-keeping issues. By exploring the strategies used by these researchers, this paper aims to: promote discussion on feminist methodologies; highlight pathways that may be created when negotiating the challenging process of accessing data pertinent to this relatively understudied area; contribute to a community of practice; and provide useful insights into what Mason & Stubbs (2010:16) refer to as “the open and honest reflexivity through the research process by describing the assumptions, and hiccups” for future researchers navigating governance landscapes.
Abstract:
This paper comprehensively reviews recent developments in modeling lane-changing behavior. The major lane-changing models in the literature are categorized into two groups: models that aim to capture the lane-changing decision-making process, and models that aim to quantify the impact of lane-changing behavior on surrounding vehicles. The methodologies and important features (including limitations) of representative models in each category are outlined and discussed. Future research needs are identified.
Abstract:
The ability to identify and assess user engagement with transmedia productions is vital to the success of individual projects and the sustainability of this mode of media production as a whole. It is essential that industry players have access to tools and methodologies that offer the most complete and accurate picture of how audiences/users engage with their productions and which assets generate the most valuable returns on investment. Drawing upon research conducted with Hoodlum Entertainment, a Brisbane-based transmedia producer, this project involved an initial assessment of the way engagement tends to be understood, why standard web analytics tools are ill-suited to measuring it, how a customised tool could offer solutions, and why this question of measuring engagement is so vital to the future of transmedia as a sustainable industry. Working with data provided by Hoodlum Entertainment and Foxtel Marketing, the outcome of the study was a prototype for a custom data visualisation tool that allowed access, manipulation and presentation of user engagement data, both historic and predictive. The prototyped interfaces demonstrate how the visualisation tool would collect and organise data specific to multiplatform projects by aggregating data across a number of platform reporting tools. Such a tool is designed to encompass not only platforms developed by the transmedia producer but also sites developed by fans. This visualisation tool accounted for multiplatform experience projects whose top level comprises people, platforms and content. People include characters, actors, audience, distributors and creators. Platforms include television, Facebook and other relevant social networks, literature, cinema and other media that might be included in the multiplatform experience. Content refers to discrete media texts employed within the platform, such as a tweet, a YouTube video, a Facebook post, an email or a television episode.
Core content is produced by the creators of multiplatform experiences to advance the narrative, while complementary content generated by audience members offers further contributions to the experience. Equally important is the timing with which the components of the experience are introduced and how they interact with and impact upon each other. By being able to combine, filter and sort these elements in multiple ways, we can better understand the value of certain components of a project. The tool also offers insights into the relationship between the timing of the release of components and the user activity associated with them, which further highlights the efficacy (or, indeed, failure) of assets as catalysts for engagement. In collaboration with Hoodlum we have developed a number of design scenarios experimenting with the ways in which data can be visualised and manipulated to tell a more refined story about the value of user engagement with certain project components and activities. This experimentation will serve as the basis for future research.
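The cross-platform aggregation step described above can be illustrated in a few lines. The platform names, asset names and counts below are entirely hypothetical, not drawn from Hoodlum's or Foxtel's actual data model; this is only a sketch of how per-platform engagement reports might be merged and ranked per asset.

```python
from collections import defaultdict

# Hypothetical per-platform reports: each maps a project asset to an
# engagement count (views, likes, replies, ...) from that platform's
# reporting tool. All names and figures are illustrative.
platform_reports = {
    "television": {"episode_01": 120000},
    "facebook": {"episode_01": 3400, "character_blog": 870},
    "youtube": {"character_blog": 2100, "fan_recap": 560},
}

def aggregate_engagement(reports):
    """Sum engagement per asset across all platform reports."""
    totals = defaultdict(int)
    for platform, assets in reports.items():
        for asset, count in assets.items():
            totals[asset] += count
    # Rank assets by total engagement so the most valuable surface first.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for asset, total in aggregate_engagement(platform_reports):
    print(asset, total)
```

Filtering the same structure by platform or by release window would support the timing analyses discussed above.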
Abstract:
The majority of patients with non-small-cell lung cancer (NSCLC) present with advanced disease, with targeted therapies providing some improvement in clinical outcomes. The epidermal growth factor receptor (EGFR) tyrosine kinase (TK) plays an important role in the pathogenesis of NSCLC. Tyrosine kinase inhibitors (TKIs), which target the EGFR TK domain, have proven to be an effective treatment strategy; however, patient responses to treatment vary considerably. Therefore, the identification of patients most likely to respond to treatment is essential to optimise the benefit of TKIs. Tumour-associated activating mutations in EGFR can identify patients with NSCLC who are likely to have a good response to TKIs. Nonetheless, the majority of patients relapse within a year of starting treatment. Studies of tumours at relapse have demonstrated expression of a T790M mutation in exon 20 of the EGFR TK domain in approximately 50% of cases. Although conferring resistance to reversible TKIs, these patients may remain sensitive to new-generation irreversible/pan-ErbB inhibitors. A number of techniques have been employed for genotypic assessment of tumour-associated DNA to identify EGFR mutations, each of which has advantages and disadvantages. This review presents an overview of the current methodologies used to identify such molecular markers. Recent developments in technology may make the monitoring of changes in patients' tumour genotypes easier in clinical practice, which may enable patients' treatment regimens to be tailored during the course of their disease, potentially leading to improved patient outcomes.
Abstract:
There is a growing trend to offer students learning opportunities that are flexible, innovative and engaging. As educators embrace student-centred agile teaching and learning methodologies, which require continuous reflection and adaptation, the need to evaluate students’ learning in a timely manner has become more pressing. Conventional evaluation surveys currently dominate the evaluation landscape internationally, despite recognition that they are insufficient to effectively evaluate curriculum and teaching quality. Surveys often: (1) fail to address the issues for which educators need feedback, (2) constrain student voice, (3) have low response rates and (4) occur too late to benefit current students. Consequently, this paper explores principles of effective feedback to propose a framework for learner-focused evaluation. We apply a three-stage control model, involving feedforward, concurrent and feedback evaluation, to investigate the intersection of assessment and evaluation in agile learning environments. We conclude that learner-focused evaluation cycles can be used to guide action so that evaluation is not undertaken simply for the benefit of future offerings, but rather to benefit current students by allowing ‘real-time’ learning activities to be adapted in the moment. As a result, students become co-producers of learning and evaluation becomes a meaningful, responsive dialogue between students and their instructors.
Abstract:
In eukaryotes, numerous complex sub-cellular structures exist, the majority of which are delineated by membranes. Many proteins are trafficked to these structures in order to carry out their correct physiological function. Assigning the sub-cellular location of a protein is of paramount importance to biologists in the elucidation of its role and in the refinement of knowledge of cellular processes by tracing certain activities to specific organelles. Membrane proteins are a key set of proteins, as they form part of the boundary of the organelles and carry out many important functions, for example as transporters and receptors and in trafficking. They are, however, some of the most challenging proteins to work with due to poor solubility, a wide concentration range within the cell and inaccessibility to many of the tools employed in proteomics studies. This review focuses on membrane proteins, with particular emphasis on sub-cellular localization and the methodologies that can be used to accurately assign membrane proteins to organelles. We also discuss what is known about the membrane protein cohorts of major organelles.
Abstract:
Age trajectories for personality traits are known to be similar across cultures. To address whether stereotypes of age groups reflect these age-related changes in personality, we asked participants in 26 countries (N = 3,323) to rate typical adolescents, adults, and old persons in their own country. Raters across nations tended to share similar beliefs about different age groups; adolescents were seen as impulsive, rebellious, undisciplined, preferring excitement and novelty, whereas old people were consistently considered lower on impulsivity, activity, antagonism, and Openness. These consensual age group stereotypes correlated strongly with published age differences on the five major dimensions of personality and most of 30 specific traits, using as criteria of accuracy both self-reports and observer ratings, different survey methodologies, and data from up to 50 nations. However, personal stereotypes were considerably less accurate, and consensual stereotypes tended to exaggerate differences across age groups.
Abstract:
While the synthesis of acting methodologies in intercultural acting has been discussed at length, little discussion has focussed on the potential of diverse actor training styles to affect performance making and audience reception. This article explores a project where the abstract elements of the British and American cultures were translated in rehearsal and in production through the purposeful juxtaposition of two differing actor training styles: the British ‘traditional’ approach and the American Method. William Nicholson’s Shadowlands was produced by Crossbow Productions at the Brisbane Powerhouse in 2010. Nicholson’s play contains a discourse on the cultural cringe of British–American relations. As a research project, the production aimed to extend and augment audience experience of the socio-cultural tensions inherent in the play by juxtaposing two seemingly culturally inscribed approaches to acting. Actors were chosen who had been trained under a traditional conservatoire approach and the American Method. A brief overview of these acting approaches is followed by a discussion centred on the project. This article analyses how, from the casting room to the rehearsal room to the mise en scène and into the audience discussions, cultural issues were articulated, translated and debated through the language of acting.
Abstract:
This article examines the importance of the social evidence base in relation to the development of the law. It argues that there is a need for those lawyers who play a part in law reform (legislators and those involved in the law reform process) and for those who play a part in formulating policy-based common law rules (judges and practitioners) to know more about how facts are established in the social sciences. It argues that lawyers need sufficient knowledge and skills in order to be able to critically assess the facts and evidence base when examining new legislation and also when preparing, arguing and determining the outcomes of legal disputes. For this reason the article argues that lawyers need enhanced training in empirical methodologies in order to function effectively in modern legal contexts.
Abstract:
Over the last two decades, particularly in Australia and the UK, the doctoral landscape has changed considerably, with increasingly hybridised approaches to methodologies and research strategies as well as greater choice of examinable outputs. This paper provides an overview of doctoral practices that are emerging in the creative industries context, from a predominantly Australian perspective, with a focus on practice-led approaches within the Doctor of Philosophy and recent developments in professional doctorates. The paper examines some of the diverse theoretical principles which foreground the practitioner/researcher; methodological approaches that incorporate tacit knowledge and reflective practice together with qualitative strategies; blended learning delivery modes; and flexible doctoral outputs; and how these are shaping this shifting environment towards greater research-based industry outputs. The discussion is based around a single extended case study of the Doctor of Creative Industries at Queensland University of Technology (QUT) as one model of an interdisciplinary professional research doctorate.
Abstract:
Exposure control and case-control methodologies are common techniques for estimating crash risks; however, they require either observational data on control cases or exogenous exposure data, such as vehicle-kilometres travelled. This study proposes an alternative methodology for estimating the crash risk of road user groups, whilst controlling for exposure under a variety of roadway, traffic and environmental factors, by using readily available police-reported crash data. In particular, the proposed method employs a combination of a log-linear model and the quasi-induced exposure technique to identify significant interactions among a range of roadway, environmental and traffic conditions and to estimate the associated crash risks. The proposed methodology is illustrated using a set of police-reported crash data from January 2004 to June 2009 on roadways in Queensland, Australia. Exposure-controlled crash risks of motorcyclists involved in multi-vehicle crashes at intersections were estimated under various combinations of variables such as posted speed limit, intersection control type, intersection configuration and lighting condition. Results show that the crash risk of motorcycles at three-legged intersections is high if the posted speed limits along the approaches are greater than 60 km/h. The crash risk at three-legged intersections is also high when they are unsignalized. Dark lighting conditions appear to increase the crash risk of motorcycles at signalized intersections, but the problem of night-time conspicuity of motorcyclists at intersections is lessened on approaches with lower speed limits. This study demonstrates that this combined methodology is a promising tool for gaining new insights into the crash risks of road user groups, and is transferable to other road users.
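The core idea of quasi-induced exposure, that not-at-fault involvements in two-vehicle crashes proxy for a group's exposure on the road, can be sketched briefly. The records and group names below are hypothetical, not drawn from the Queensland dataset, and this is only the basic involvement ratio, not the full log-linear model the abstract describes.

```python
from collections import Counter

# Hypothetical two-vehicle crash records: (road_user_group, role).
# Under quasi-induced exposure, the not-at-fault party is assumed to be
# randomly "selected" by the at-fault party, so not-at-fault counts
# approximate each group's share of exposure.
crashes = [
    ("motorcycle", "at_fault"), ("motorcycle", "not_at_fault"),
    ("car", "at_fault"), ("car", "not_at_fault"),
    ("car", "not_at_fault"), ("motorcycle", "at_fault"),
    ("car", "at_fault"), ("car", "not_at_fault"),
]

def relative_crash_involvement(records):
    """Ratio of each group's at-fault share to its not-at-fault (exposure) share."""
    at_fault = Counter(g for g, r in records if r == "at_fault")
    not_at_fault = Counter(g for g, r in records if r == "not_at_fault")
    total_af = sum(at_fault.values())
    total_naf = sum(not_at_fault.values())
    # Ratio > 1 suggests over-involvement relative to exposure.
    return {
        g: (at_fault[g] / total_af) / (not_at_fault[g] / total_naf)
        for g in at_fault
        if not_at_fault[g] > 0
    }

print(relative_crash_involvement(crashes))
```

In the study itself this ratio would be computed within strata defined by speed limit, intersection type and lighting condition, which is where the log-linear model enters.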
Abstract:
Product Lifecycle Management (PLM) systems are widely used in the manufacturing industry. A core feature of such systems is support for versioning of product data. As workflow functionality is increasingly used in PLM systems, the possibility emerges that the versioning transitions for product objects, as encapsulated in process models, do not comply with the valid version control policies mandated in the objects’ actual lifecycles. In this paper we propose a solution to tackle the (non-)compliance issues between processes and object version control policies. We formally define the notion of compliance between these two artifacts in product lifecycle management and then develop a compliance-checking method which employs a well-established workflow analysis technique. This forms the basis of a tool which offers automated support for the proposed approach. By applying the approach to a collection of real-life specifications in a major PLM system, we demonstrate the practical applicability of our solution to the field.
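The compliance notion described above can be reduced, in its simplest form, to checking that every version transition a process performs is sanctioned by the policy. The state names and transitions below are hypothetical, and the set-difference check stands in for the workflow analysis technique the paper actually employs.

```python
# Hypothetical version control policy: the allowed transitions over an
# object's lifecycle states. State names are illustrative only.
policy = {
    ("in_work", "released"),
    ("released", "obsolete"),
    ("released", "in_work"),   # e.g. re-opening a released version
}

# Version transitions induced by a (hypothetical) process model.
process_transitions = {
    ("in_work", "released"),
    ("in_work", "obsolete"),   # skips the released state: a violation
}

def check_compliance(process, policy):
    """Return the set of process transitions not sanctioned by the policy."""
    return process - policy

violations = check_compliance(process_transitions, policy)
print(sorted(violations))  # → [('in_work', 'obsolete')]
```

A real process model would of course carry control flow, so the induced transitions must first be extracted by analysing its possible executions, which is where the workflow analysis comes in.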