981 results for Tim O'Brien
Abstract:
Copoly(2-oxazoline)s, prepared by cationic ring-opening polymerization of 2-(dec-9-enyl)-2-oxazoline with either 2-methyl-2-oxazoline or 2-ethyl-2-oxazoline, have been crosslinked with small dithiol molecules under UV-irradiation to form homogeneous networks. In-situ monitoring of the crosslinking reaction by photo-rheology revealed network formation within minutes. The degree of swelling in water was found to be tunable by the hydrophilicity of the starting macromers and the proportion of alkene side arms. Furthermore, degradable hydrogels have been prepared based on a hydrolytically cleavable dithiol crosslinker.
Abstract:
Given the importance of water for rice production, this study examines the factors affecting the technical efficiency (TE) of irrigated rice farmers in village irrigation systems (VIS) in Sri Lanka. Primary data were collected from 460 rice farmers in the Kurunegala District, Sri Lanka, to estimate a stochastic translog production frontier for rice production. The mean TE of rice farming in village irrigation was found to be 0.72, although 63% of rice farmers exceeded this average. The most influential factors of TE are membership of Farmer Organisations (FOs) and the rate of participation in collective actions organised by FOs. The results suggest that strengthening farmers' co-operative arrangements through FO membership is important for increasing TE in rice farming in VIS.
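In the stochastic frontier framework the abstract refers to, farm-level technical efficiency is conventionally derived from an estimated non-negative inefficiency term. A minimal sketch of that step, using invented inefficiency values rather than the survey data:

```python
# In a stochastic frontier model, technical efficiency for farm i is
# TE_i = exp(-u_i), where u_i >= 0 is the estimated inefficiency term.
# The u_i values below are hypothetical, not estimates from the study.
import math

def technical_efficiency(u):
    """Map an inefficiency estimate u >= 0 to a TE score in (0, 1]."""
    return math.exp(-u)

inefficiencies = [0.1, 0.3, 0.5, 0.2]           # hypothetical u_i estimates
te = [technical_efficiency(u) for u in inefficiencies]
mean_te = sum(te) / len(te)
share_above_mean = sum(t > mean_te for t in te) / len(te)
```

Because TE scores are bounded above by 1 and skewed, the share of farms above the mean can differ markedly from 50%, which is why a mean TE of 0.72 is compatible with 63% of farmers exceeding it.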
Abstract:
Pesticide spraying by farmers has an adverse impact on their health. However, in studies to date examining farmers’ exposure to pesticides, the costs of ill health and their determinants have been based on information provided by farmers themselves. Some doubt has therefore been cast on the reliability of these estimates. In this study, we address this by conducting surveys among two groups of farmers who use pesticides on a regular basis. The first group is made up of farmers who perceive that their ill health is due to exposure to pesticides and have obtained at least some form of treatment (described in this article as the ‘general farmer group’). The second group is composed of farmers whose ill health has been diagnosed by doctors and who have been treated in hospital for exposure to pesticides (described here as the ‘hospitalised farmer group’). Cost comparisons are made between the two groups of farmers. Regression analysis of the determinants of health costs shows that the most important determinants of medical costs for both samples are defensive expenditure, the quantity of pesticides used per acre per month, the frequency of pesticide use and the number of pesticides used per hour per day. The results have important policy implications.
Abstract:
There remains a substantial shortfall in the treatment of severe skeletal injuries. The current gold standard, autologous bone grafting from the same patient, has many undesirable side effects, such as donor site morbidity. Tissue engineering seeks to offer a solution to this problem. The primary requirements for tissue-engineered scaffolds are already well established, and many materials, such as polyesters, present themselves as potential candidates for bone defects; they have comparable structural features, but they often lack the osteoconductivity required to promote adequate bone regeneration. By combining these materials with biological growth factors, which promote the infiltration of cells into the scaffold as well as differentiation into the specific cell and tissue type, it is possible to increase the formation of new bone. However, the cost and potential complications associated with growth factors mean that controlled release is an important consideration in the design of new bone tissue engineering strategies. This review covers recent research in the area of encapsulation and release of growth factors within a variety of different polymeric scaffolds.
Abstract:
This paper outlines a method for studying online activity using both qualitative and quantitative methods: topical network analysis. A topical network refers to "the collection of sites commenting on a particular event or issue, and the links between them" (Highfield, Kirchhoff, & Nicolai, 2011, p. 341). The approach complements the analysis of large datasets, enabling the examination and comparison of different discussions as a means of improving our understanding of the uses of social media and other forms of online communication. Developed for an analysis of political blogging, the method also has wider applications for other social media websites such as Twitter.
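As a rough illustration of the quantitative side of the idea (the site names and links below are invented placeholders, not data from Highfield et al.), a topical network can be assembled from the hyperlinks between sites commenting on a shared issue, with in-degree indicating which sites the discussion converges on:

```python
# Minimal sketch: build a directed topical network from (source, target)
# hyperlinks observed in posts about one topic, tracking in-degree.
from collections import defaultdict

def build_topical_network(links):
    """links: iterable of (source_site, target_site) hyperlink pairs."""
    out_links = defaultdict(set)   # site -> set of sites it links to
    in_degree = defaultdict(int)   # site -> number of distinct inbound links
    for src, dst in links:
        if dst not in out_links[src]:   # count each source->target edge once
            out_links[src].add(dst)
            in_degree[dst] += 1
    return out_links, in_degree

# Hypothetical example data
links = [("blogA", "blogB"), ("blogA", "news1"),
         ("blogC", "blogB"), ("blogB", "news1")]
out_links, in_degree = build_topical_network(links)
```

The qualitative step of the method then works the other way: reading the highly linked sites to characterise the discussion itself.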
Abstract:
Interaction design is about finding better ways for people to interact with each other through communication technologies. Interaction design involves understanding how people learn, work and play so that we can engineer better, more valuable technologies that are more appropriate to the contexts of their lives. As an academic discipline, interaction design is about the people-research that underpins these technologies. As a comparative tool for business, it is about creating innovations that have market pull rather than a technology push. Many examples can be found which demonstrate the value of interaction design within both industry and academia; however, finding common ground across this spectrum of activity is often difficult. Differences in language, approach and outcomes often lead to researchers from either side of the spectrum complaining of an uncommon ground, which often results in a lack of collaboration within such projects. However, as demonstrated through this case study, rather than focussing on finding a common ground to assist in better collaboration between industry and academia, celebrating the uniqueness of each approach whilst bridging them with a common language can lead to new knowledge and commercial innovation. This case study focuses on the research and development phase of a Diversionary Therapy Platform, a collaboration between the Australasian CRC for Interaction Design and The Royal Children's Hospital (Brisbane, Australia). This collaborative effort has led to the formation of a new commercial venture, Diversionary Therapy Pty Ltd, which aims to bring the project's research outcomes to market. The case study outlines the collaborative research and development process undertaken between the many stakeholders and reflects on the challenges identified within this process. A key finding from this collaboration was allowing for the co-existence of the common and uncommon ground throughout the project.
This concept will be discussed further throughout this paper.
Abstract:
This conference celebrates the passing of 40 years since the establishment of the Internet (dating this, presumably, to the first connection between two nodes on ARPANET in October 1969). For a gathering of media scholars such as this, however, it may be just as important not only to mark the first testing of the core technologies upon which much of our present-day Net continues to build, but also to reflect on another recent milestone: the 20th anniversary of what is today arguably the chief interface through which billions around the world access and experience the Internet – the World Wide Web, launched by Tim Berners-Lee in 1989.
Abstract:
BACKGROUND The work described in this paper has emerged from an ALTC/OLT funded project, Exploring Intercultural Competency in Engineering. The project identified many facets of culture and intercultural competence that go beyond a culture-as-nationality paradigm. It was clear from this work that resources were needed to help engineering educators introduce students to the complex issues of culture as they relate to engineering practice. A set of learning modules focussing on intercultural competence in engineering practice was developed early in the project. Through the OLT project, these modules have been expanded into a range of resources covering various aspects of culture in engineering. Supporting the resources, an eBook detailing the ins and outs of intercultural competency has also been developed to assist engineering educators to embed opportunities for students to develop skills in unpacking and managing cross-cultural challenges in engineering practice. PURPOSE This paper describes the key principles behind the development of the learning modules, the areas they cover and the eBook developed to support the modules. The paper is intended as an introduction to the approaches and resources, and extends an invitation to the community to draw from, and contribute to, this initial work. DESIGN/METHOD A key aim of this project was to go beyond the culture-as-nationality approach adopted in much of the work around intercultural competency (Deardorff, 2011). The eBook explores different dimensions of culture such as workplace culture, culture’s influence on engineering design, and culture in the classroom. The authors describe how these connect to industry practice and explore what they mean for engineering education. The packaged learning modules described here have been developed as a matrix of approaches, moving from familiar, known methods through to complicated activities relying to some extent on expert knowledge.
Some modules draw on the concept of ‘complex un-order’ as described in the ‘Cynefin domains’ proposed by Kurtz and Snowden (2003). RESULTS Several of the modules included in the eBook have already been trialled at a variety of institutions. Feedback from staff has been reassuringly positive so far. Further trials are planned for second semester 2012, and version 1 of the eBook and learning modules, Engineering Across Cultures, is due to be released in late October 2012. CONCLUSIONS The Engineering Across Cultures eBook and learning modules provide a useful, ready-to-use resource to help educators tackle the complex issue of intercultural competency in engineering education. The book is by no means exhaustive, and nor are the modules; instead, they provide an accessible, engineering-specific guide to bringing cultural issues into the engineering classroom.
Abstract:
Amongst the most prominent uses of Twitter at present is its role in the discussion of widely televised events: Twitter’s own statistics for 2011, for example, list major entertainment spectacles (the MTV Music Awards, the BET Awards) and sports matches (the UEFA Champions League final, the FIFA Women’s World Cup final) amongst the events generating the most tweets per second during the year (Twitter, 2011). User activities during such televised events constitute a specific, unique category of Twitter use, which differs clearly from the other major events that generate a high rate of tweets per second (such as crises and breaking news, from the Japanese earthquake and tsunami to the death of Steve Jobs), as preliminary research has shown. During such major media events, by contrast, Twitter is used predominantly as a technology of fandom: it serves in the first place as a backchannel to television and other streaming audiovisual media, enabling users to offer their own running commentary on the universally shared media text of the event broadcast as it unfolds live. Centrally, this communion of fans around the shared text is facilitated by the use of Twitter hashtags – unifying textual markers which are now often promoted to prospective audiences by the broadcasters well in advance of the live event itself. This paper examines the use of Twitter as a technology for the expression of shared fandom in the context of a major, internationally televised annual media event: the Eurovision Song Contest. It constitutes a highly publicised, highly choreographed media spectacle whose eventual outcomes are unknown ahead of time, and which attracts a diverse international audience. Our analysis draws on comprehensive datasets for the ‘official’ event hashtags, #eurovision, #esc, and #sbseurovision.
Using innovative methods which combine qualitative and quantitative approaches to the analysis of Twitter datasets containing several hundred thousand tweets, we examine overall patterns of participation to discover how audiences express their fandom throughout the event. Minute-by-minute tracking of Twitter activity during the live broadcasts enables us to identify the most resonant moments during each event; we also examine the networks of interaction between participants to detect thematically or geographically determined clusters of interaction, and to identify the most visible and influential participants in each network. Such analysis is able to provide a unique insight into the use of Twitter as a technology for fandom and for what cultural studies research calls ‘audiencing’: the public performance of belonging to the distributed audience for a shared media event. Our work thus contributes to the examination of fandom practices led by Henry Jenkins (2006) and other scholars, and points to Twitter as an important new medium facilitating the connection and communion of such fans.
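Minute-by-minute tracking of hashtag activity, as described above, amounts to bucketing tweet timestamps by minute and looking for peaks. A minimal sketch (the timestamps below are invented placeholders, not data from the Eurovision collection):

```python
# Bucket hashtag-tweet timestamps into per-minute counts to locate the
# most resonant moments of a live broadcast.
from collections import Counter
from datetime import datetime

def tweets_per_minute(timestamps):
    """timestamps: ISO-8601 strings for tweets carrying an event hashtag."""
    return Counter(
        datetime.fromisoformat(t).strftime("%Y-%m-%d %H:%M")
        for t in timestamps
    )

# Hypothetical example data
ts = ["2012-05-26 21:00:05", "2012-05-26 21:00:40", "2012-05-26 21:01:10"]
buckets = tweets_per_minute(ts)
peak_minute, peak_count = buckets.most_common(1)[0]
```

Aligning the resulting peak minutes against the broadcast schedule is what lets specific on-screen moments be identified as drivers of audience response.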
Abstract:
Spatio-temporal interest points are the most popular feature representation in the field of action recognition. A variety of methods have been proposed to detect and describe local patches in video, with several techniques reporting state-of-the-art performance for action recognition. However, the reported results are obtained under different experimental settings with different datasets, making it difficult to compare the various approaches. As a result, we seek to comprehensively evaluate state-of-the-art spatio-temporal features under a common evaluation framework with popular benchmark datasets (KTH, Weizmann) and more challenging datasets such as Hollywood2. The purpose of this work is to provide guidance for researchers when selecting features for different applications with different environmental conditions. In this work we evaluate four popular descriptors (HOG, HOF, HOG/HOF, HOG3D) using a popular bag of visual features representation, and Support Vector Machines (SVM) for classification. Moreover, we provide an in-depth analysis of local feature descriptors and optimize the codebook sizes for different datasets with different descriptors. In this paper, we demonstrate that motion-based features offer better performance than those that rely solely on spatial information, while features that combine both types of data are more consistent across a variety of conditions, but typically require a larger codebook for optimal performance.
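The bag-of-visual-features step named above quantises each clip's local descriptors against a codebook to give a fixed-length histogram that an SVM can then classify. A rough sketch of that quantisation (not the authors' implementation; the 2-D descriptors and two-word codebook are toy values — real HOG/HOF descriptors are high-dimensional and codebooks hold hundreds to thousands of words):

```python
# Quantise local descriptors against a visual codebook and build a
# normalised bag-of-visual-features histogram.
import numpy as np

def bovf_histogram(descriptors, codebook):
    """descriptors: (n, d) local features extracted from one video clip.
    codebook: (k, d) visual words, e.g. k-means centroids of training features.
    Returns a normalised k-bin histogram suitable as an SVM input vector."""
    # Euclidean distance from every descriptor to every codeword
    dists = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = dists.argmin(axis=1)            # nearest visual word per descriptor
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

codebook = np.array([[0.0, 0.0], [1.0, 1.0]])           # toy 2-word codebook
descs = np.array([[0.1, 0.0], [0.9, 1.1], [1.0, 0.9]])  # toy descriptors
h = bovf_histogram(descs, codebook)
# one descriptor falls nearest word 0, two nearest word 1
```

The codebook size k is the parameter the paper reports optimising per dataset and descriptor: too small and distinct motions collapse into one bin, too large and the histograms become sparse.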
Abstract:
Chlamydia pneumoniae is an enigmatic human and animal pathogen. Originally discovered in association with acute human respiratory disease, it is now associated with a remarkably wide range of chronic diseases as well as having a cosmopolitan distribution within the animal kingdom. Molecular typing studies suggest that animal strains are ancestral to human strains and that C. pneumoniae crossed from animals to humans as the result of at least one relatively recent zoonotic event. Whole genome analyses appear to support this concept – the human strains are highly conserved whereas the single animal strain that has been fully sequenced has a larger genome with several notable differences. When compared to the other, better known chlamydial species that is implicated in human infection, Chlamydia trachomatis, C. pneumoniae demonstrates pertinent differences in its cell biology, development, and genome structure. Here, we examine the characteristic facets of C. pneumoniae biology, offering insights into the diversity and evolution of this silent and ancient pathogen.
Abstract:
Advanced substation applications, such as synchrophasors and IEC 61850-9-2 sampled value process buses, depend upon highly accurate synchronizing signals for correct operation. The IEEE 1588 Precision Timing Protocol (PTP) is the recommended means of providing precise timing for future substations. This paper presents a quantitative assessment of PTP reliability using Fault Tree Analysis. Two network topologies are proposed that use grandmaster clocks with dual network connections and take advantage of the Best Master Clock Algorithm (BMCA) from IEEE 1588. The cross-connected grandmaster topology doubles reliability, and the addition of a shared third grandmaster gives a nine-fold improvement over duplicated grandmasters. The performance of BMCA-mediated handover of the grandmaster role during contingencies in the timing system was evaluated experimentally. The 1 µs performance requirement of sampled values and synchrophasors is met, even during network or GPS antenna outages. Slave clocks are shown to synchronize to the backup grandmaster in response to degraded performance or loss of the main grandmaster. Slave disturbances are less than 350 ns provided the grandmaster reference clocks are not offset from one another. A clear understanding of PTP reliability and the factors that affect availability will encourage the adoption of PTP for substation time synchronization.
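The reliability gains from redundant grandmasters rest on standard fault-tree arithmetic: with independent timing paths, the top event (loss of timing) requires every path to fail, an AND gate whose probability is the product of the path unavailabilities. A toy sketch with assumed figures (not the paper's own fault-tree data):

```python
# Toy fault-tree arithmetic for redundant timing sources: the timing
# system fails only if every independent grandmaster path fails
# (AND gate on the fault tree), so unavailabilities multiply.

def unavailability_parallel(component_unavailabilities):
    """Top-event probability for an AND gate over independent components."""
    q = 1.0
    for qi in component_unavailabilities:
        q *= qi
    return q

single = 1e-3   # assumed unavailability of one grandmaster path (illustrative)
dual = unavailability_parallel([single, single])            # duplicated
triple = unavailability_parallel([single, single, single])  # shared third GM
```

With these illustrative numbers, each added independent source multiplies availability by three orders of magnitude; in practice common-cause failures (shared GPS, shared network segments) limit the gain, which is why the paper's cross-connected and shared-third-grandmaster topologies matter.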
Abstract:
During the last four decades, educators have created a range of critical literacy approaches for different contexts, including compulsory schooling (Luke & Woods, 2009) and second language education (Luke & Dooley, 2011). Despite inspirational examples of critical work with young students (e.g., O’Brien, 1994; Vasquez, 1994), Comber (2012) laments the persistent myth that critical literacy is not viable in the early years. Assumptions about childhood innocence and the priorities of the back-to-basics movement seem to limit the possibilities for early years literacy teaching and learning. Yet teachers of young students need not face an either/or choice between the basic and critical dimensions of literacy. Systematic ways of treating literacy in all its complexity exist. We argue that the integrative imperative is especially important in schools that are under pressure to improve technical literacy outcomes. In this chapter, we document how critical literacy was addressed in a fairytales unit taught to 4.5- to 5.5-year-olds in a high-diversity, high-poverty Australian school. We analyze the affordances and challenges of different approaches to critical literacy, concluding they are complementary rather than competing sources of possibility. Furthermore, we make the case for turning familiar classroom activities to critical ends.
Abstract:
Background: The size of the carrier influences drug aerosolization from a dry powder inhaler (DPI) formulation. Lactose particles with irregular shape and rough surface in a variety of sizes are commonly used as carriers; however, contradictory reports exist regarding the effect of carrier size on the dispersion of drug. We examined the influence of the spherical particle size of the biodegradable polylactide-co-glycolide (PLGA) carrier on the aerosolization of a model drug, salbutamol sulphate (SS). Methods: Four different sizes (20-150 µm) of polymer carriers were fabricated using a solvent evaporation technique, and the dispersion of SS from these carriers was measured by a Twin Stage Impinger (TSI). The size and morphological properties of the polymer carriers were determined by laser diffraction and SEM, respectively. Results: The fine particle fraction (FPF) was found to increase from 5.6% to 21.3% with increasing carrier size up to 150 µm. Conclusions: The aerosolization of drug increased linearly with the size of the polymer carriers. For a fixed mass of drug particles in a formulation, the mass of drug particles per unit area of carrier is higher in formulations containing the larger carriers, which leads to an increase in the dispersion of drug due to the increased mechanical forces occurring between the carriers and the device walls.
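The FPF values reported above follow the standard fine particle fraction calculation for impinger measurements: drug mass recovered from the lower (fine particle) stage of the TSI as a percentage of the total recovered dose. A minimal sketch with invented masses chosen only to reproduce the abstract's 5.6% and 21.3% endpoints:

```python
# Fine particle fraction from Twin Stage Impinger deposition data.
# The masses below are hypothetical examples, not measured values.

def fine_particle_fraction(stage2_mass_ug, total_recovered_mass_ug):
    """FPF (%) = drug deposited in TSI stage 2 / total recovered drug x 100."""
    return 100.0 * stage2_mass_ug / total_recovered_mass_ug

fpf_small_carrier = fine_particle_fraction(11.2, 200.0)   # smallest carriers
fpf_large_carrier = fine_particle_fraction(42.6, 200.0)   # 150 um carriers
```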
Abstract:
Game jams provide design researchers with an extraordinary opportunity to watch creative teams in action, and recent years have seen a number of projects which seek to illuminate the design process as seen in these events. For example, Gaydos, Harris and Martinez discuss the opportunity of the jam to expose students to principles of design process and design spaces (2011). Rouse muses on the game jam ‘as radical practice’ and a ‘corrective to game creation as it is normally practiced’. His observations about his own experience in a jam emphasise the same artistic endeavour forefronted earlier, where the experience is about creation that is divorced from the instrumental motivations of commercial game design (Rouse 2011) and where the focus is on process over product. Other participants remark on the social milieu of the event as a critical factor and the collaborative opportunity as a rich site to engage participants in design processes (Shin et al, 2012). Shin et al are particularly interested in the notion of the site of the process and the ramifications of participants being in the same location. They applaud the more localized event where there is an emphasis on local participation and collaboration. For other commentators, it is specifically the social experience in the place of the jam that is the most important aspect (see Keogh 2011): not the material site but rather the physical, embodied experience of ‘being there’ and being part of the event. Participants talk about game jams they have attended in a manner similar to the observations made by Dourish, where the experience is layered on top of the physical space of the event (Dourish 2006). It is as if the event has taken on qualities of place, where we find echoes of Tuan’s description of a particular site having an aura of history that makes it a very different place, redolent and evocative (Tuan 1977).
Re-presenting the experience in place has become the goal of the data visualisation project now at the focus of our own curated 48hr game jam. Taking our cue from the work of Tim Ingold on embodied practice, we have established the 48hr game making challenge as a site for data visualisation research in place.