Abstract:
Amongst the most prominent uses of Twitter at present is its role in the discussion of widely televised events: Twitter’s own statistics for 2011, for example, list major entertainment spectacles (the MTV Music Awards, the BET Awards) and sports matches (the UEFA Champions League final, the FIFA Women’s World Cup final) amongst the events generating the most tweets per second during the year (Twitter, 2011). User activities during such televised events constitute a specific, unique category of Twitter use, which differs clearly from the other major event types that generate a high rate of tweets per second (such as crises and breaking news, from the Japanese earthquake and tsunami to the death of Steve Jobs), as preliminary research has shown. During such major media events, by contrast, Twitter is used predominantly as a technology of fandom: it serves in the first place as a backchannel to television and other streaming audiovisual media, enabling users to offer their own running commentary on the universally shared media text of the event broadcast as it unfolds live. Centrally, this communion of fans around the shared text is facilitated by the use of Twitter hashtags – unifying textual markers which are now often promoted to prospective audiences by the broadcasters well in advance of the live event itself. This paper examines the use of Twitter as a technology for the expression of shared fandom in the context of a major, internationally televised annual media event: the Eurovision Song Contest. It constitutes a highly publicised, highly choreographed media spectacle whose eventual outcomes are unknown ahead of time and which attracts a diverse international audience. Our analysis draws on comprehensive datasets for the ‘official’ event hashtags, #eurovision, #esc, and #sbseurovision. Using innovative methods which combine qualitative and quantitative approaches to the analysis of Twitter datasets containing several hundred thousand tweets, we examine overall patterns of participation to discover how audiences express their fandom throughout the event. Minute-by-minute tracking of Twitter activity during the live broadcasts enables us to identify the most resonant moments during each event; we also examine the networks of interaction between participants to detect thematically or geographically determined clusters of interaction, and to identify the most visible and influential participants in each network. Such analysis is able to provide a unique insight into the use of Twitter as a technology for fandom and for what in cultural studies research is called ‘audiencing’: the public performance of belonging to the distributed audience for a shared media event. Our work thus contributes to the examination of fandom practices led by Henry Jenkins (2006) and other scholars, and points to Twitter as an important new medium facilitating the connection and communion of such fans.
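As an illustration of the minute-by-minute tracking described above, the following sketch (ours, not the authors' pipeline; the file name and column names are assumptions) bins captured hashtag tweets into one-minute windows and surfaces the highest-volume minutes as candidate resonant moments:

```python
# Illustrative sketch only: bin captured hashtag tweets into one-minute windows
# and list the busiest minutes of the broadcast. File and column names are assumed.
import pandas as pd

tweets = pd.read_csv("eurovision_tweets.csv", parse_dates=["created_at"])

per_minute = (
    tweets.set_index("created_at")
          .resample("1min")          # one-minute bins across the broadcast window
          .size()
          .rename("tweet_count")
)

# The highest-volume minutes are candidate 'resonant moments' for closer qualitative reading.
print(per_minute.nlargest(10))
```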
Abstract:
Spatio-temporal interest points are the most popular feature representation in the field of action recognition. A variety of methods have been proposed to detect and describe local patches in video, with several techniques reporting state-of-the-art performance for action recognition. However, the reported results are obtained under different experimental settings with different datasets, making it difficult to compare the various approaches. We therefore seek to comprehensively evaluate state-of-the-art spatio-temporal features under a common evaluation framework with popular benchmark datasets (KTH, Weizmann) and more challenging datasets such as Hollywood2. The purpose of this work is to provide guidance for researchers when selecting features for different applications with different environmental conditions. In this work we evaluate four popular descriptors (HOG, HOF, HOG/HOF, HOG3D) using a popular bag of visual features representation, and Support Vector Machines (SVM) for classification. Moreover, we provide an in-depth analysis of local feature descriptors and optimize the codebook sizes for different datasets with different descriptors. We demonstrate that motion-based features offer better performance than those that rely solely on spatial information, while features that combine both types of data are more consistent across a variety of conditions, but typically require a larger codebook for optimal performance.
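A minimal sketch of the bag-of-visual-features pipeline evaluated in this work, assuming local descriptors (HOG, HOF, etc.) have already been extracted per video; the codebook size and the dataset variables are illustrative assumptions rather than the paper's exact settings:

```python
# Illustrative bag-of-visual-features + SVM sketch; local descriptors are assumed precomputed.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def build_codebook(all_descriptors, k=1000):
    """Quantise pooled local descriptors into k visual words."""
    return KMeans(n_clusters=k, n_init=4, random_state=0).fit(all_descriptors)

def encode(video_descriptors, codebook):
    """L1-normalised histogram of visual-word assignments for one video."""
    words = codebook.predict(video_descriptors)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

def train_and_evaluate(train_videos, test_videos, k=1000):
    """train_videos / test_videos: lists of (descriptor_array, label) pairs (assumed)."""
    codebook = build_codebook(np.vstack([d for d, _ in train_videos]), k)
    X_train = np.array([encode(d, codebook) for d, _ in train_videos])
    y_train = [y for _, y in train_videos]
    X_test = np.array([encode(d, codebook) for d, _ in test_videos])
    y_test = [y for _, y in test_videos]
    clf = SVC(kernel="rbf", C=10.0).fit(X_train, y_train)
    return clf.score(X_test, y_test)   # classification accuracy
```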
Abstract:
Chlamydia pneumoniae is an enigmatic human and animal pathogen. Originally discovered in association with acute human respiratory disease, it is now associated with a remarkably wide range of chronic diseases as well as having a cosmopolitan distribution within the animal kingdom. Molecular typing studies suggest that animal strains are ancestral to human strains and that C. pneumoniae crossed from animals to humans as the result of at least one relatively recent zoonotic event. Whole genome analyses appear to support this concept – the human strains are highly conserved whereas the single animal strain that has been fully sequenced has a larger genome with several notable differences. When compared to the other, better known chlamydial species that is implicated in human infection, Chlamydia trachomatis, C. pneumoniae demonstrates pertinent differences in its cell biology, development, and genome structure. Here, we examine the characteristic facets of C. pneumoniae biology, offering insights into the diversity and evolution of this silent and ancient pathogen.
Abstract:
Advanced substation applications, such as synchrophasors and IEC 61850-9-2 sampled value process buses, depend upon highly accurate synchronizing signals for correct operation. The IEEE 1588 Precision Timing Protocol (PTP) is the recommended means of providing precise timing for future substations. This paper presents a quantitative assessment of PTP reliability using Fault Tree Analysis. Two network topologies are proposed that use grandmaster clocks with dual network connections and take advantage of the Best Master Clock Algorithm (BMCA) from IEEE 1588. The cross-connected grandmaster topology doubles reliability, and the addition of a shared third grandmaster gives a nine-fold improvement over duplicated grandmasters. The performance of BMCA-mediated handover of the grandmaster role during contingencies in the timing system was evaluated experimentally. The 1 µs performance requirement of sampled values and synchrophasors is met, even during network or GPS antenna outages. Slave clocks are shown to synchronize to the backup grandmaster in response to degraded performance or loss of the main grandmaster. Slave disturbances are less than 350 ns provided the grandmaster reference clocks are not offset from one another. A clear understanding of PTP reliability and the factors that affect availability will encourage the adoption of PTP for substation time synchronization.
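The flavour of the fault-tree arithmetic behind such topology comparisons can be shown with a simple sketch (the failure and repair figures below are illustrative assumptions, not values from the paper, and shared network paths are ignored):

```python
# Illustrative only: redundant grandmaster clocks treated as an AND gate in a
# fault tree. MTBF/MTTR values are assumptions, not the paper's data.
def unavailability(mtbf_hours, mttr_hours):
    """Steady-state unavailability of one repairable component."""
    return mttr_hours / (mtbf_hours + mttr_hours)

gm = unavailability(mtbf_hours=50_000, mttr_hours=24)   # a single grandmaster clock

single     = gm          # timing lost if the only grandmaster fails
duplicated = gm ** 2     # two independent grandmasters must both fail
triple     = gm ** 3     # duplicated pair plus a shared third grandmaster

print(f"single: {single:.2e}  duplicated: {duplicated:.2e}  triple: {triple:.2e}")
```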
Abstract:
Background: The size of the carrier influences drug aerosolization from a dry powder inhaler (DPI) formulation. Lactose particles with irregular shape and rough surface, in a variety of sizes, are traditionally used as carriers; however, contradictory reports exist regarding the effect of carrier size on the dispersion of drug. We examined the influence of the spherical particle size of the biodegradable polylactide-co-glycolide (PLGA) carrier on the aerosolization of a model drug, salbutamol sulphate (SS). Methods: Four different sizes (20-150 µm) of polymer carriers were fabricated using a solvent evaporation technique and the dispersion of SS from these carriers was measured by a Twin Stage Impinger (TSI). The size and morphological properties of the polymer carriers were determined by laser diffraction and SEM, respectively. Results: The fine particle fraction (FPF) was found to increase from 5.6% to 21.3% with increasing carrier size up to 150 µm. Conclusions: The aerosolization of drug increased linearly with the size of the polymer carriers. For a fixed mass of drug particles in a formulation, the mass of drug particles per unit area of carrier is higher in formulations containing the larger carriers, which leads to an increase in the dispersion of drug due to the increased mechanical forces occurring between the carriers and the device walls.
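A back-of-envelope calculation (ours, not the paper's; it assumes monodisperse spherical carriers of density ρ and a fixed total carrier mass M_c) makes the surface-area argument in the conclusion explicit:

```latex
% For N spheres of diameter d: M_c = N \rho \pi d^3 / 6 and A = N \pi d^2, so
\[
  A_{\text{carrier}} = \frac{6\,M_c}{\rho\, d}
  \quad\Longrightarrow\quad
  \frac{m_{\text{drug}}}{A_{\text{carrier}}} = \frac{m_{\text{drug}}\,\rho\, d}{6\,M_c}
  \;\propto\; d ,
\]
% i.e. at fixed drug and carrier masses, the drug load per unit carrier surface
% area grows linearly with carrier diameter.
```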
Abstract:
Game jams provide design researchers with an extraordinary opportunity to watch creative teams in action, and recent years have seen a number of projects which seek to illuminate the design process as it unfolds in these events. For example, Gaydos, Harris and Martinez discuss the opportunity of the jam to expose students to principles of design process and design spaces (2011). Rouse muses on the game jam ‘as radical practice’ and a ‘corrective to game creation as it is normally practiced’. His observations about his own experience in a jam emphasise the same artistic endeavour foregrounded earlier, where the experience is about creation that is divorced from the instrumental motivations of commercial game design (Rouse 2011) and where the focus is on process over product. Other participants remark on the social milieu of the event as a critical factor and the collaborative opportunity as a rich site to engage participants in design processes (Shin et al, 2012). Shin et al are particularly interested in the notion of the site of the process and the ramifications of participants being in the same location. They applaud the more localised event where there is an emphasis on local participation and collaboration. For other commentators, it is specifically the social experience in the place of the jam that is the most important aspect (see Keogh 2011): not the material site but rather the physical, embodied experience of ‘being there’ and being part of the event. Participants talk about game jams they have attended in a manner similar to the observations made by Dourish, where the experience is layered on top of the physical space of the event (Dourish 2006). It is as if the event has taken on qualities of place, where we find echoes of Tuan’s description of a particular site having an aura of history that makes it a very different place, redolent and evocative (Tuan 1977). Re-presenting the experience in place is the goal of the data visualisation project at the centre of our own curated 48hr game jam. Taking our cue from the work of Tim Ingold on embodied practice, we have now established the 48hr game making challenge as a site for data visualisation research in place.
Abstract:
The selection of optimal camera configurations (camera locations, orientations etc.) for multi-camera networks remains an unsolved problem. Previous approaches largely focus on proposing various objective functions to achieve different tasks. Most of them, however, do not generalize well to large-scale networks. To tackle this, we introduce a statistical formulation of the optimal selection of camera configurations as well as propose a Trans-Dimensional Simulated Annealing (TDSA) algorithm to effectively solve the problem. We compare our approach with a state-of-the-art method based on Binary Integer Programming (BIP) and show that our approach offers similar performance on small-scale problems. However, we also demonstrate the capability of our approach in dealing with large-scale problems and show that our approach produces better results than two alternative heuristics designed to deal with the scalability issue of BIP.
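A minimal sketch of what a trans-dimensional simulated annealing search over camera configurations can look like: the move set (add, drop or perturb a camera) changes the dimensionality of the solution during the search. The objective function `coverage` and the coordinate ranges are placeholder assumptions, not the paper's formulation:

```python
# Illustrative trans-dimensional simulated annealing sketch; `coverage` is a
# user-supplied objective (higher is better) that must accept an empty configuration.
import math
import random

def propose(config):
    """Return a neighbouring configuration, possibly with a different number of cameras."""
    config = list(config)
    move = random.choice(["add", "drop", "perturb"])
    if move == "add" or not config:
        config.append((random.uniform(0, 100),      # x (assumed site extent)
                       random.uniform(0, 100),      # y
                       random.uniform(0, 360)))     # orientation in degrees
    elif move == "drop" and len(config) > 1:
        config.pop(random.randrange(len(config)))
    else:
        i = random.randrange(len(config))
        x, y, o = config[i]
        config[i] = (x + random.gauss(0, 2), y + random.gauss(0, 2),
                     (o + random.gauss(0, 10)) % 360)
    return config

def anneal(coverage, steps=10_000, t0=1.0, cooling=0.999):
    """Greedy-plus-random search with a cooling temperature schedule."""
    current, best, t = [], [], t0
    for _ in range(steps):
        candidate = propose(current)
        delta = coverage(candidate) - coverage(current)
        if delta > 0 or random.random() < math.exp(delta / t):
            current = candidate
            if coverage(current) > coverage(best):
                best = current
        t *= cooling
    return best
```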
Abstract:
Background Non-fatal health outcomes from diseases and injuries are a crucial consideration in the promotion and monitoring of individual and population health. The Global Burden of Disease (GBD) studies done in 1990 and 2000 have been the only studies to quantify non-fatal health outcomes across an exhaustive set of disorders at the global and regional level. Neither effort quantified uncertainty in prevalence or years lived with disability (YLDs). Methods Of the 291 diseases and injuries in the GBD cause list, 289 cause disability. For 1160 sequelae of the 289 diseases and injuries, we undertook a systematic analysis of prevalence, incidence, remission, duration, and excess mortality. Sources included published studies, case notification, population-based cancer registries, other disease registries, antenatal clinic serosurveillance, hospital discharge data, ambulatory care data, household surveys, other surveys, and cohort studies. For most sequelae, we used a Bayesian meta-regression method, DisMod-MR, designed to address key limitations in descriptive epidemiological data, including missing data, inconsistency, and large methodological variation between data sources. For some disorders, we used natural history models, geospatial models, back-calculation models (models calculating incidence from population mortality rates and case fatality), or registration completeness models (models adjusting for incomplete registration with health-system access and other covariates). Disability weights for 220 unique health states were used to capture the severity of health loss. YLDs by cause at age, sex, country, and year levels were adjusted for comorbidity with simulation methods. We included uncertainty estimates at all stages of the analysis. Findings Global prevalence for all ages combined in 2010 across the 1160 sequelae ranged from fewer than one case per 1 million people to 350 000 cases per 1 million people. Prevalence and severity of health loss were weakly correlated (correlation coefficient −0·37). In 2010, there were 777 million YLDs from all causes, up from 583 million in 1990. The main contributors to global YLDs were mental and behavioural disorders, musculoskeletal disorders, and diabetes or endocrine diseases. The leading specific causes of YLDs were much the same in 2010 as they were in 1990: low back pain, major depressive disorder, iron-deficiency anaemia, neck pain, chronic obstructive pulmonary disease, anxiety disorders, migraine, diabetes, and falls. Age-specific prevalence of YLDs increased with age in all regions and has decreased slightly from 1990 to 2010. Regional patterns of the leading causes of YLDs were more similar compared with years of life lost due to premature mortality. Neglected tropical diseases, HIV/AIDS, tuberculosis, malaria, and anaemia were important causes of YLDs in sub-Saharan Africa. Interpretation Rates of YLDs per 100 000 people have remained largely constant over time but rise steadily with age. Population growth and ageing have increased YLD numbers and crude rates over the past two decades. Prevalences of the most common causes of YLDs, such as mental and behavioural disorders and musculoskeletal disorders, have not decreased. Health systems will need to address the needs of the rising numbers of individuals with a range of disorders that largely cause disability but not mortality. Quantification of the burden of non-fatal health outcomes will be crucial to understand how well health systems are responding to these challenges. 
Effective and affordable strategies to deal with this rising burden are an urgent priority for health systems in most parts of the world. Funding Bill & Melinda Gates Foundation.
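For orientation, the prevalence-based YLD computation described above can be restated compactly (our summary, with symbols of our own choosing, prior to the simulation-based comorbidity adjustment): for a cause c with sequelae s, prevalence P_s and disability weight DW_s in a given age-sex-country-year stratum,

```latex
\[
  \mathrm{YLD}_c \;=\; \sum_{s \in c} P_s \times \mathrm{DW}_s .
\]
```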
Abstract:
Artists: Donna Hewitt, Julian Knowles, Wade Marynowsky, Tim Bruniges, Avril Huddy. Macrophonics presents new Australian work emerging from the leading edge of performance interface research. The program addresses the dialogue between traditional and emerging digital media, as well as the dialogue across a broad range of musical traditions. Due to recent technological developments, we have reached a point artistically where the relationships between media and genres are being completely re-evaluated. This program presents a cross-section of responses to this condition. Each of the works in the program foregrounds an approach to performance that integrates sensors and novel performance control devices and/or examines how machines can be made musical in performance. Containing works for voice, electronics, video, movement and sensor-based gestural controllers, it critically surveys the interface between humans and machines in performance. From sensor-based microphones and guitars and performance a/v to post-rock dronescapes and experimental electronica, Macrophonics provides a broad and engaging survey of new performance approaches in mediatised environments.
Abstract:
A global, online quantitative study among 300 consumers of digital technology products found the most reliable information sources were friends, family or word of mouth (WOM) from someone they knew, followed by expert product reviews and product reviews written by other consumers. The most unreliable information sources were advertising or infomercials, automated recommendations based on purchasing patterns, and retailers. While only a very small number of consumers evaluated products online, rating products and taking part in online discussions were more frequent activities. The most popular social media websites for reviews were Facebook, Twitter, Amazon and eBay, indicating the importance of WOM in social networks and online media spaces that feature product reviews, since WOM is the most persuasive form of information in both online and offline social networks. These results suggest that ‘social customers’ must be considered an integral part of a marketing strategy.
Abstract:
Teachers of construction economics and estimating have for a long time recognised that there is more to construction pricing than detailed calculation of costs (to the contractor). We always get to the point where we have to say "of course, experience or familiarity of the market is very important and this needs judgement, intuition, etc". Quite how important this is in construction pricing is not known, and we tend to trivialise its effect. If judgement of the market has a minimal effect, little harm would be done, but if it is really important then some quite serious consequences arise which go well beyond the teaching environment. Major areas of concern for the quantity surveyor are in cost modelling and cost planning - neither of which pays any significant attention to the market effect. There are currently two schools of thought about the market effect issue. The first school is prepared to ignore possible effects until more is known. This may be called the pragmatic school. The second school exists solely to criticise the first school. We will call this the antagonistic school. Neither the pragmatic nor the antagonistic schools seem to be particularly keen to resolve the issue one way or the other. The founder and leader of the antagonistic school is Brian Fine, whose 1974 paper is still the basic text on the subject and in which he coined the term 'socially acceptable' price to describe what we now recognise as the market effect. Mr Fine's argument was then, and remains, that the uncertainty surrounding the contractors' costing and cost estimating process is such that it logically leads to a market-orientated pricing approach. Very little factual evidence, however, seems to be available to support these arguments in any conclusive manner. A further, and more important, point for the pragmatic school is that, even if the market effect is as important as Mr Fine believes, there are no indications of how it can be measured, evaluated or predicted. Since 1974 evidence has been accumulating which tends to reinforce the antagonists' view. A review of the literature covering both contractors' and designers' estimates found many references to the use of value judgements in construction pricing (Ashworth & Skitmore, 1985), which supports the antagonistic view in implying the existence of uncertainty overload. The most convincing evidence emerged quite by accident in some research we recently completed with practising quantity surveyors on estimating accuracy (Skitmore, 1985). In addition to demonstrating that individual quantity surveyors and certain types of buildings had a significant effect on estimating accuracy, one surprise result was that only a very small amount of information was used by the most expert surveyors for relatively very accurate estimates. Only the type and size of building, it seemed, were really relevant in determining accuracy. More detailed information about the buildings' specification, and even a sight of the drawings, did not significantly improve their accuracy level. This seemed to offer clear evidence that the constructional aspects of the project were largely irrelevant and that the expert surveyors were somehow tuning in to the market price of the building. The obvious next step is to feed our expert surveyors with more relevant 'market' information in order to assess its effect.
The problem with this is that our experts do not seem able to verbalise their requirements in this respect - a common occurrence in research of this nature. The lack of research into the nature of market effects on prices also means the literature provides little of benefit. Hence the need for this study. It was felt that a clearer picture of the nature of construction markets would be obtained in an environment where free enterprise was a truly ideological force. For this reason, the United States of America was chosen for the next stage of our investigations. Several people were interviewed in an informal and unstructured manner to elicit their views on the action of market forces on construction prices. Although a small number of people were involved, they were thought to be reasonably representative of knowledge in construction pricing. They were also well able to articulate their views. Our initial reaction to the interviews was that our USA subjects held views very close to those held in the UK. However, detailed analysis revealed the existence of remarkably clear and consistent insights that would not have been obtained in the UK. Further evidence was also obtained from literature relating to the subject, and some of the interviewees very kindly expanded on their views in later postal correspondence. We have now analysed all the evidence received and, although a great deal is of an anecdotal nature, we feel that our findings enable at least the basic nature of the subject to be understood and that the factors and their interrelationships can now be examined more formally in relation to construction price levels. I must express my gratitude to the Royal Institution of Chartered Surveyors' Educational Trust and the University of Salford's Department of Civil Engineering for collectively funding this study. My sincere thanks also go to our American participants who freely gave their time and valuable knowledge to us in our enquiries. Finally, I must record my thanks to Tim and Anne for their remarkable ability to produce an intelligible typescript from my unintelligible writing.
Abstract:
Melt electrospinning in a direct writing mode is a recent additive manufacturing approach to fabricate porous scaffolds for tissue engineering applications. In this study, we describe porous and cell-invasive poly(ε-caprolactone) scaffolds fabricated by combining melt electrospinning and a programmable x–y stage. Fibers were 7.5 ± 1.6 µm in diameter and separated by interfiber distances ranging from 8 to 133 µm, with an average of 46 ± 22 µm. Micro-computed tomography revealed that the resulting scaffolds had a highly porous (87%), three-dimensional structure. Due to the high porosity and interconnectivity of the scaffolds, a top-seeding method was adequate to achieve fibroblast penetration, with cells present throughout and underneath the scaffold. This was confirmed histologically, whereby a 3D fibroblast-scaffold construct with full cellular penetration was produced after 14 days in vitro. Immunohistochemistry was used to confirm the presence and even distribution of the key dermal extracellular matrix proteins, collagen type I and fibronectin. These results show that melt electrospinning in a direct writing mode can produce cell invasive scaffolds, using simple top-seeding approaches.
Abstract:
Amphiphilic poly(ethylene glycol)-block-poly(dimethylsiloxane)-block-poly(ethylene glycol) (PEG-block-PDMS-block-PEG) triblock copolymers have been successfully prepared via hydrosilylation using discrete and polydisperse PEG of various chain lengths. Facile synthesis of discrete PEG (dPEG) is achieved via systematic tosylation and etherification of lower glycols. The amphiphilicity of the dPEG-block-PDMS-block-dPEG triblock copolymer is illustrated by dynamic light scattering (DLS) and measurement of the critical micelle concentration (CMC).
Abstract:
A well-engineered scaffold for regenerative medicine, which is suitable to be translated from the bench to the bedside, combines inspired design, technical innovation and precise craftsmanship. Electrospinning and additive manufacturing are separate approaches to manufacturing scaffolds for a variety of tissue engineering applications. A need to accurately control the spatial distribution of pores within scaffolds has recently resulted in combining the two processing methods, to overcome shortfalls in each technology. This review describes where electrospinning and additive manufacturing are used together to generate new porous structures for biological applications.
Abstract:
This article explores the relationship between the usage of an external accountant and family firm sales growth and survival. Using a longitudinal panel of Australian small and medium-sized family enterprises, we find that external accountants have a positive impact on sales growth and survival. We also find that the degree to which the accountant is acquainted with the family and the firm’s needs, which we term embeddedness, moderates these positive outcomes. Furthermore, we find that appropriate strategic planning processes are necessary to maximize the sales growth benefit; however, these processes are not necessary to gain the survival benefit.