945 results for user-created content


Relevance: 20.00%

Abstract:

The decision of Baldwin v Icon Energy Ltd [2015] QSC 12 is generally instructive on the minimum required to enforce an agreement to negotiate. The language of these agreements is always couched in terms which include the expressions “good faith” and “reasonable endeavours” as the yardstick of behaviour expected of each party in the negotiation intended to follow such an agreement. However, the mere statement of these intended characteristics of negotiation may not be sufficient to ensure that the agreement to negotiate is enforceable.

Relevance: 20.00%

Abstract:

Historically, drug use has been understood as a problem of epidemiology, psychiatry, physiology, and criminality requiring legal and medical governance. Consequently, drug research tends to be underpinned by an imperative to better govern, and typically proposes policy interventions to prevent or solve drug problems. We argue that categories of ‘addictive’ and ‘recreational’ drug use are discursive forms of governance that are historically, politically and socially contingent. These constructions of the drug problem shape what drug users believe about themselves and how they enact these beliefs in their drug use practices. Based on qualitative interviews with young illicit drug users in Brisbane, Australia, this paper uses Michel Foucault’s concept of governmentality to provide insights into how the governance of illicit drugs intersects with self-governance to create a drug user self. We propose a reconceptualisation of illicit drug use that takes into account the contingencies and subjective factors that shape the drug experience. This allows for an understanding of the relationships between discourses, policies, and practices in constructions of illicit drug users.

Relevance: 20.00%

Abstract:

Virtual working environments are intrinsic to the contemporary workplace, and collaborative skills are a vital graduate capability. To develop students’ collaborative skills, first-year medical laboratory science students undertake a group poster project based on a blended learning model. Learning is scaffolded through lectures, workshops in collaborative learning spaces, practitioner mentoring sessions, and online resources. Google Drive provides an online collaborative space in which students realise tangible outcomes from this learning. A Google Drive document is created for each group and shared with its members. In this space, students assign tasks and plan workflow, share research, progressively develop poster content, reflect and comment on peer contributions, and use the messaging functions to ‘talk’ to group members. This provides a readily accessible, transparent record of group work, crucial for peer assessment, and a communication channel for group members and the lecturer, who can support groups if required. This knowledge-creation space also augments the productivity and effectiveness of face-to-face collaboration. As members are randomly allocated to groups and are often of diverse backgrounds and unknown to each other, students build resilience as they navigate the uncertainties and complexities of group dynamics, learning to focus on the goal of the team task while engaging constructively and professionally in team dialogue. Students are responsible and accountable for individual and group work. The use of Google Drive was evaluated in a survey comprising Likert-scale and open-ended qualitative questions, and the responses were analysed statistically. Results show that 79% of students valued the inclusion of an online space in collaborative work and 78% appreciated the flexibility provided by Google Drive, while recognising the need for improved notification functionality. Teaching staff recognised the advantages for monitoring and moderating collaborative group work, and the marked progression in students’ acquisition of collaborative and technological skills, including professional dialogue.

Relevance: 20.00%

Abstract:

Background: Exposure to air pollutants, including diesel particulate matter, has been linked to adverse respiratory health effects. Inhaled diesel particulate matter contains adsorbed organic compounds, and it is not clear whether the adsorbed organics or the residual components are more deleterious to airway cells. Using a physiologically relevant model, we investigated the role of diesel organic content in mediating cellular responses of primary human bronchial epithelial cells (HBECs) cultured at an air-liquid interface (ALI).

Methods: Primary HBECs were cultured and differentiated at ALI for at least 28 days. To determine which component is most harmful, we compared primary HBEC responses elicited by residual (organics removed) diesel emissions (DE) with those elicited by neat (unmodified) DE for 30 and 60 minutes at ALI, with cigarette smoke condensate (CSC) as the positive control and filtered air as the negative control. Cell viability (WST-1 cell proliferation assay), inflammation (TNF-α, IL-6 and IL-8 ELISA) and changes in gene expression (qRT-PCR for HO-1, CYP1A1, TNF-α and IL-8 mRNA) were measured.

Results: Immunofluorescence and cytological staining confirmed the mucociliary phenotype of primary HBECs differentiated at ALI. Neat DE caused a comparable reduction in cell viability at 30 and 60 min exposures, whereas residual DE caused a greater reduction at 60 min. When corrected for cell viability, cytokine secretion of TNF-α, IL-6 and IL-8 was maximal with residual DE at 60 min. mRNA expression of HO-1, CYP1A1, TNF-α and IL-8 did not differ significantly between exposures.

Conclusion: This study provides new insights into epithelial cell responses to diesel emissions using a physiologically relevant aerosol exposure model. Both the organic content and the residual components of diesel emissions play an important role in determining the bronchial epithelial cell response in vitro. Future studies should test potentially useful interventions against the adverse health effects of air pollution exposure.
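
To illustrate the viability correction mentioned in the results, a minimal sketch follows; the condition names and readings are hypothetical placeholders, not data from the study, and the paper's own computation is not reproduced here.

```python
# Sketch: viability-corrected cytokine secretion, assuming hypothetical
# raw ELISA readings (pg/mL) and WST-1 viability fractions per exposure.
raw_secretion = {            # IL-8 per exposure condition (invented values)
    "filtered_air": 120.0,
    "neat_DE_60min": 310.0,
    "residual_DE_60min": 280.0,
}
viability = {                # surviving cell fraction (invented values)
    "filtered_air": 1.00,
    "neat_DE_60min": 0.72,
    "residual_DE_60min": 0.55,
}

# Dividing by the surviving fraction expresses secretion per viable cell,
# so a more cytotoxic exposure is not under-counted.
corrected = {k: raw_secretion[k] / viability[k] for k in raw_secretion}
for condition, value in sorted(corrected.items(), key=lambda kv: -kv[1]):
    print(f"{condition}: {value:.1f}")
```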

Relevance: 20.00%

Abstract:

The pervasive use of the World Wide Web by the general population has created a cultural shift in “our living world”. It has enabled more people to share more information about more events and issues than was possible before its general use. As a consequence, it has transformed traditional news media’s approach to almost every aspect of journalism, with many organisations restructuring their philosophy and practice to include a variety of participatory spaces/forums where people are free to engage in deliberative dialogue about matters of public importance. Moreover, while news media were the traditional gatekeepers of information, today many organisations allow, to differing degrees, the general public and other independent journalism entities to participate in the news production process, which may include agenda setting and content production. This paper draws from an international collective case study of networked online news journalism. It examines the ways in which different traditional news media models use digital tools and technologies for participatory communication of information about matters of public interest. The research finds differences between the ways in which public service, commercial and independent news media give voice to the public, and ultimately in their approach to journalism’s role as the Fourth Estate, one of the key institutions of democracy. The work is framed by the notion that journalism in democratic societies has a key role in ensuring citizens are informed about and engaged with public affairs. An examination of four media models, the British Broadcasting Corporation (BBC), the Guardian, News Limited and OhmyNews, showcases their various approaches to networked online news journalism and how each provides different avenues for citizen empowerment. The cases are described and analysed in the context of their own social, political and economic settings. Semi-structured in-depth interviews with senior journalists and editors provide specific comparisons of the distinctive practices of each organisation. In particular, these show how the ideal of democracy can be used as a tool of persuasion as much as a method of deliberation.

Relevance: 20.00%

Abstract:

This paper investigates the effects of experience on the intuitiveness of physical and visual interactions performed by airport security screeners. Using portable eye-tracking glasses, 40 security screeners were observed in the field as they performed search, examination and interface interactions during airport security x-ray screening. Data from semi-structured interviews were used to further explore the nature of visual and physical interactions. Results show positive relationships between experience and the intuitiveness of visual and physical interactions performed by security screeners. As experience is gained, security screeners perform search, examination and interface interactions more intuitively. In addition to experience, results suggest that intuitiveness is affected by the nature and modality of the activities performed. This inference is based on the dominant processing styles associated with search and examination activities. The paper concludes by discussing the implications of this research for the design of visual and physical interfaces. We recommend designing interfaces that build on users’ already established intuitive processes and that reduce the cognitive load incurred during transitions between visual and physical interactions.

Relevance: 20.00%

Abstract:

Context: Whether the action of estrogen in skeletal development depends on estrogen receptor α as encoded by the ESR1 gene is unknown. Objectives: The aim of this study was to establish whether the gain in area-adjusted bone mineral content (ABMC) in girls occurs in late puberty and to examine whether the magnitude of this gain is related to ESR1 polymorphisms. Design: We conducted a cross-sectional analysis. Setting: The study involved the Avon Longitudinal Study of Parents and Children (ALSPAC), a population-based prospective study. Participants: Participants included 3097 11-yr-olds with DNA samples, dual x-ray absorptiometry measurements, and pubertal stage information. Outcomes: Outcome measures included separate prespecified analyses in boys and girls of the relationship between ABMC derived from total body dual x-ray absorptiometry scans and Tanner stage and of the interaction between ABMC, Tanner stage, and ESR1 polymorphisms. Results: Total body less head and spinal ABMC were higher in girls in Tanner stages 4 and 5, compared with those in Tanner stages 1, 2, and 3. In contrast, height increased throughout puberty. No differences were observed in ABMC according to Tanner stage in boys. For rs2234693 (PvuII) and rs9340799 (XbaI) polymorphisms, differences in spinal ABMC in late puberty were 2-fold greater in girls who were homozygous for the C and G alleles, respectively (P = 0.001). For rs7757956, the difference in total body less head ABMC in late puberty was 50% less in individuals homozygous or heterozygous for the A allele (P = 0.006). Conclusions: Gains in ABMC in late pubertal girls are strongly associated with ESR1 polymorphisms, suggesting that estrogen contributes to this process via an estrogen receptor α-dependent pathway.
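
A minimal sketch of how such a genotype-by-pubertal-stage interaction might be tested, assuming a hypothetical data layout (columns 'abmc', 'late_puberty', 'genotype'); this is not the ALSPAC analysis code, and the values below are invented for illustration.

```python
# Sketch: does the late-puberty gain in ABMC differ by ESR1 genotype?
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical toy data; real input would be the cohort measurements.
df = pd.DataFrame({
    "abmc":         [1.10, 1.12, 1.30, 1.45, 1.08, 1.11, 1.20, 1.28],
    "late_puberty": [0, 0, 1, 1, 0, 0, 1, 1],   # Tanner 4-5 vs Tanner 1-3
    "genotype":     ["CC", "CC", "CC", "CC", "TT", "TT", "TT", "TT"],
})

# The interaction term mirrors the comparison described above: a nonzero
# late_puberty:genotype coefficient means the pubertal gain depends on allele.
model = smf.ols("abmc ~ late_puberty * C(genotype)", data=df).fit()
print(model.params)
```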

Relevance: 20.00%

Abstract:

This thesis examines the significance of crowdfunding for Australian filmmakers and provides an empirical basis for current claims about the role of crowdfunding in the film production and policy sectors. It finds that crowdfunding is a small but growing source of supplementary finance which is opening up new possibilities for Australian independent screen content producers. The project also highlights discussion within Australian film policy circles that is opening the way for crowdfunding to become a larger and more formalised component of current and emerging policy initiatives.

Relevance: 20.00%

Abstract:

Total tRNAs isolated from chloroplasts and etioplasts of cucumber cotyledons were compared with respect to amino acid acceptance, isoacceptor distribution and extent of modification. Aminoacylation of the tRNAs with the nine amino acids studied indicated that the relative acceptor activities of chloroplast total tRNAs for four amino acids are significantly higher than those of etioplast total tRNAs. Two-dimensional polyacrylamide gel electrophoresis (2D-PAGE) of chloroplast total tRNAs separated at least 32 spots, while approximately 41 spots were resolved from etioplast total tRNAs. Comparison of the reversed-phase chromatography (RPC-5) profiles of chloroplast and etioplast leucyl-, lysyl-, phenylalanyl- and valyl-tRNA species showed no qualitative differences in the elution profiles. However, the leucyl-, lysyl- and valyl-tRNA species showed quantitative differences in the relative amounts of the isoaccepting species present in chloroplasts and etioplasts. Analysis of the modified nucleotides of total tRNAs from the two plastid types indicated that etioplast total tRNA was undermodified with respect to ribothymidine, isopentenyladenosine/hydroxyisopentenyladenosine, 1-methylguanosine and 2'-O-methylguanosine. This indicates that illumination may cause de novo synthesis of nuclear-encoded chloroplast tRNA-modifying enzymes, leading to the formation of highly modified tRNAs in chloroplasts. Based on these results, we speculate that the observed decrease in levels of aminoacylation, the variations in the relative amounts of certain isoacceptors, and the extra tRNA spots with differing electrophoretic mobilities in etioplast total tRNAs compared with chloroplast total tRNAs could be due to partially undermodified etioplast tRNAs. Taken together, the data suggest that the light-induced transformation of etioplasts into chloroplasts is accompanied by increases in the relative levels of some functional chloroplast tRNAs through post-transcriptional nucleotide modifications.

Relevance: 20.00%

Abstract:

We report the fabrication of assembled nanostructures from pre-synthesized nanocrystal building blocks through the optical means of exciton formation and dissociation. We demonstrate that LixCoO2 nanocrystals assemble into an acicular architecture upon prolonged exposure to ultraviolet-visible radiation emitted from a 125 W mercury vapor lamp, through the intermediate excitation of excitons. The results obtained in the present study clearly show how nanocrystals of various materials, with band gaps appropriate for the excitation of excitons at given optical wavelengths, can be assembled into unusual nanoarchitectures through illumination with incoherent light sources. The disappearance of the exciton bands of the LixCoO2 phase in the optical spectrum of the irradiated film comprising the acicular structure is consistent with the proposed mechanism of exciton dissociation in the observed light-induced assembly process. The assembly proceeds through attractive Coulomb interactions between charged dots created upon exciton dissociation. Our work presents a new type of nanocrystal assembly process that is light-driven and exciton-directed.
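
For reference, the attractive Coulomb interaction invoked here has the standard screened form below; this is a textbook expression under the assumption of point-like charged dots, not a formula taken from the paper.

```latex
% Coulomb interaction energy between two charged dots a distance r apart,
% screened by the medium's relative permittivity \varepsilon_r:
U(r) = \frac{q_1 q_2}{4\pi\varepsilon_0\varepsilon_r r}
% U(r) < 0, i.e. attractive, when q_1 and q_2 carry opposite signs,
% as for the oppositely charged dots left behind by exciton dissociation.
```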

Relevance: 20.00%

Abstract:

A series of dual-phase (DP) steels containing finely dispersed martensite with different volume fractions of martensite (Vm) were produced by intermediate quenching of a boron- and vanadium-containing microalloyed steel. The volume fraction of martensite was varied from 0.3 to 0.8 by changing the intercritical annealing temperature. The tensile and impact properties of these steels were studied and compared to those of step-quenched steels, which showed banded microstructures. The experimental results show that DP steels with finely dispersed microstructures have excellent mechanical properties, including high impact toughness values, with an optimum in properties obtained at Vm ~ 0.55. A further increase in Vm was found to decrease the yield and tensile strengths as well as the impact properties. It was shown that models developed on the basis of a rule of mixtures are inadequate in capturing the tensile properties of DP steels with Vm > 0.55. Jaoul-Crussard analyses of the work-hardening behavior of the high-martensite-volume-fraction DP steels show three distinct stages of plastic deformation.
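
The rule of mixtures referred to above estimates composite strength as a volume-weighted average of the phase strengths; the standard linear form is sketched below (the paper's exact variant may differ).

```latex
% Linear rule of mixtures for a ferrite-martensite dual-phase steel:
\sigma_{DP} = V_m\,\sigma_m + (1 - V_m)\,\sigma_f
% \sigma_m and \sigma_f are the strengths of martensite and ferrite, and
% V_m is the martensite volume fraction; per the results above, this
% linear estimate fails to capture tensile properties for V_m > 0.55.
```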

Relevance: 20.00%

Abstract:

Purpose – Business models have to date remained the creation of management; however, it is the authors’ belief that designers should be critically approaching, challenging and creating new business models as part of their practice. This signals a new era in which business model constructs become the design brief of the future and fuel design and innovation to work together at the strategic level of an organisation. Design/methodology/approach – This paper explores and investigates business model design. The research followed a deductive, structured qualitative content analysis approach utilizing a predetermined categorization matrix. The analysis of forty business cases uncovered commonalities in the key strategic drivers behind these innovative business models. Findings – Five business model typologies were derived from the content analysis, from which quick prototypes of new business models can be created. Research limitations/implications – The research suggests there is no “one right” model; rather, through experimentation, the generation of many unique and diverse concepts can result in greater possibilities for future innovation and sustained competitive advantage. Originality/value – This paper builds upon emerging research into the importance and relevance of dynamic, design-driven approaches to the creation of innovative business models. It synthesizes knowledge gained from real-world examples into a tangible, accessible and provoking framework that provides new prototyping templates to aid the process of business model experimentation.

Relevance: 20.00%

Abstract:

The 2008 US election has been heralded as the first presidential election of the social media era, but took place at a time when social media were still in a state of comparative infancy; so much so that the most important platform was not Facebook or Twitter, but the purpose-built campaign site my.barackobama.com, which became the central vehicle for the most successful electoral fundraising campaign in American history. By 2012, the social media landscape had changed: Facebook and, to a somewhat lesser extent, Twitter are now well-established as the leading social media platforms in the United States, and were used extensively by the campaign organisations of both candidates. As third-party spaces controlled by independent commercial entities, however, their use necessarily differs from that of home-grown, party-controlled sites: from the point of view of the platform itself, a @BarackObama or @MittRomney is technically no different from any other account, except for the very high follower count and an exceptional volume of @mentions. In spite of the significant social media experience which Democrat and Republican campaign strategists had already accumulated during the 2008 campaign, therefore, the translation of such experience to the use of Facebook and Twitter in their 2012 incarnations still required a substantial amount of new work, experimentation, and evaluation. This chapter examines the Twitter strategies of the leading accounts operated by both campaign headquarters: the ‘personal’ candidate accounts @BarackObama and @MittRomney as well as @JoeBiden and @PaulRyanVP, and the campaign accounts @Obama2012 and @TeamRomney. Drawing on datasets which capture all tweets from and at these accounts during the final months of the campaign (from early September 2012 to the immediate aftermath of the election night), we reconstruct the campaigns’ approaches to using Twitter for electioneering from the quantitative and qualitative patterns of their activities, and explore the resonance which these accounts have found with the wider Twitter userbase. A particular focus of our investigation in this context will be on the tweeting styles of these accounts: the mixture of original messages, @replies, and retweets, and the level and nature of engagement with everyday Twitter followers. We will examine whether the accounts chose to respond (by @replying) to the messages of support or criticism which were directed at them, whether they retweeted any such messages (and whether there was any preferential retweeting of influential or – alternatively – demonstratively ordinary users), and/or whether they were used mainly to broadcast and disseminate prepared campaign messages. Our analysis will highlight any significant differences between the accounts we examine, trace changes in style over the course of the final campaign months, and correlate such stylistic differences with the respective electoral positioning of the candidates. Further, we examine the use of these accounts during moments of heightened attention (such as the presidential and vice-presidential debates, or in the context of controversies such as that caused by the publication of the Romney “47%” video; additional case studies may emerge over the remainder of the campaign) to explore how they were used to present or defend key talking points, and exploit or avert damage from campaign gaffes. 
A complementary analysis of the messages directed at the campaign accounts (in the form of @replies or retweets) will also provide further evidence for the extent to which these talking points were picked up and disseminated by the wider Twitter population. Finally, we also explore the use of external materials (links to articles, images, videos, and other content on the campaign sites themselves, in the mainstream media, or on other platforms) by the campaign accounts, and the resonance which these materials had with the wider follower base of these accounts. This provides an indication of the integration of Twitter into the overall campaigning process, by highlighting how the platform was used as a means of encouraging the viral spread of campaign propaganda (such as advertising materials) or of directing user attention towards favourable media coverage. By building on comprehensive, large datasets of Twitter activity (as of early October, our combined datasets comprise some 3.8 million tweets) which we process and analyse using custom-designed social media analytics tools, and by using our initial quantitative analysis to guide further qualitative evaluation of Twitter activity around these campaign accounts, we are able to provide an in-depth picture of the use of Twitter in political campaigning during the 2012 US election which will provide detailed new insights into social media use in contemporary elections. This analysis will then also serve as a touchstone for the analysis of social media use in subsequent elections, in the USA as well as in other developed nations where Twitter and other social media platforms are utilised in electioneering.
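
As a minimal sketch of the tweeting-style breakdown described above (original messages, @replies, retweets), the following classifies tweets by simple textual conventions; the sample tweets are invented, and the chapter's own custom analytics tools are not reproduced here.

```python
from collections import Counter

def tweet_style(text: str) -> str:
    """Classify a tweet as a retweet, an @reply, or an original message."""
    t = text.strip()
    if t.startswith("RT @"):   # conventional manual-retweet marker
        return "retweet"
    if t.startswith("@"):      # @reply: the tweet opens by addressing a user
        return "reply"
    return "original"

# Hypothetical sample; real input would be a full account archive.
tweets = [
    "RT @Obama2012: Four more years.",
    "@voter Thanks for your support!",
    "Early voting starts today - find your polling place.",
]

# The style mixture is the per-account proportion of each message type.
mixture = Counter(tweet_style(t) for t in tweets)
total = sum(mixture.values())
for style, n in mixture.items():
    print(f"{style}: {n / total:.0%}")
```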

Relevance: 20.00%

Abstract:

The literature contains many examples of digital procedures for the analytical treatment of electroencephalograms, but there is as yet no standard by which those techniques may be judged or compared. This paper proposes one method of generating an EEG, based on a computer program for Zetterberg's simulation. It is assumed that the statistical properties of an EEG may be represented by stationary processes having rational transfer functions, achieved by a system of software filters and random number generators. The model represents neither the neurological mechanisms responsible for generating the EEG, nor any particular type of EEG record; transient phenomena such as spikes, sharp waves and alpha bursts are also excluded. The basis of the program is a valid ‘partial’ statistical description of the EEG; that description is then used to produce a digital representation of a signal which, if plotted sequentially, might or might not by chance resemble an EEG. That is unimportant: what matters is that the statistical properties of the series remain those of a real EEG, and it is in this sense that the output is a simulation of the EEG. There is considerable flexibility in the form of the output, i.e. its alpha, beta and delta content, which may be selected by the user; the same selected parameters always produce the same statistical output. The filtered outputs from the random number sequences may be scaled to provide realistic power distributions in the accepted EEG frequency bands and then summed to create a digital output signal, the ‘stationary EEG’. It is suggested that the simulator might act as a test input to digital analytical techniques for the EEG, enabling at least a substantial part of those techniques to be compared and assessed in an objective manner. The equations necessary to implement the model are given. The program has been run on a DEC1090 computer but is suitable for any microcomputer having more than 32 kBytes of memory; the execution time required to generate a 25 s simulated EEG is in the region of 15 s.
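
A compact sketch of the filtered-noise scheme the abstract describes: white noise shaped by band filters, scaled to per-band power targets, and summed. Butterworth band-pass filters stand in here for Zetterberg's rational transfer functions, and the sampling rate, filter order and band weights are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfilt

fs = 128                        # sampling rate in Hz (an assumed value)
duration = 25                   # seconds, matching the 25 s simulated EEG above
rng = np.random.default_rng(0)  # fixed seed: same parameters, same output

# Band edges (Hz) and relative amplitude weights - illustrative values only.
bands = {
    "delta": (0.5, 4.0, 1.0),
    "alpha": (8.0, 13.0, 2.0),
    "beta":  (13.0, 30.0, 0.5),
}

n = fs * duration
eeg = np.zeros(n)
for low, high, weight in bands.values():
    # Each rational transfer function is realised as a band-pass filter
    # driven by Gaussian white noise from the random number generator.
    sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    component = sosfilt(sos, rng.standard_normal(n))
    component *= weight / component.std()  # scale to the target band power
    eeg += component                       # sum bands into the 'stationary EEG'
```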

Relevance: 20.00%

Abstract:

This thesis is a comparative case study of Japanese video game localization, covering the video games Sairen, Sairen 2 and Sairen Nyûtoransurêshon and the English-language localized versions of the same games as published in Scandinavia and Australia/New Zealand. All games are developed by Sony Computer Entertainment Inc. and published exclusively for the PlayStation 2 and PlayStation 3 consoles. The fictional world of the Sairen games draws much influence from Japanese history, as well as from popular and contemporary culture, and in doing so caters mainly to a Japanese audience. For localization, i.e. the adaptation of a product to make it accessible to users outside the market it was originally intended for, this is a challenging issue. Video games are media of entertainment, and localization practice must therefore preserve the games’ effects on the players’ emotions. Further, video games are digital products comprising a multitude of distinct elements, some of which are part of the game world, while others regulate the connection between the player as part of the real world and the game as a digital medium. As a result, video game localization is also a practice that has to cope with the technical restrictions inherent to the medium. The main theory used throughout the thesis is Anthony Pym’s framework for localization studies, which considers the user of the localized product a defining part of the localization process. This concept presupposes that localization is an adaptation performed to make a product better suited for use in a specific reception situation. Pym also addresses the fact that certain products may resist distribution into certain reception situations because of their content, and that certain aspects of localization aim to reduce this resistance through significant alterations of the original product. While Pym developed his ideas mainly with conventional software in mind, they can be adapted well to the study of video games from a localization angle. Since modern video games are highly complex entities that often switch between interactive and non-interactive modes, Pym’s ideas are adapted throughout the thesis to suit the particular elements being studied. Instances analyzed in this thesis include menu screens, video clips, in-game action and websites. The main research questions focus on how the games’ rules influence localization, and how the games’ fictional domain influences localization. Because there are so many peculiarities inherent to the medium of the video game, other theories are introduced to complement the research at hand. These include Lawrence Venuti’s discussions of foreignizing and domesticating translation methods for literary translation, and Jesper Juul’s definition of games. Additionally, knowledge gathered from interviews with video game localization professionals in Japan during September and October 2009 is utilized in this study. Apart from answering the aforementioned research questions, one aim of this thesis is to enrich the still rather small field of game localization studies, and the study of Japanese video games in particular, one of Japan’s most successful cultural exports.