241 results for Central point
Abstract:
The 2008 US election has been heralded as the first presidential election of the social media era, but took place at a time when social media were still in a state of comparative infancy; so much so that the most important platform was not Facebook or Twitter, but the purpose-built campaign site my.barackobama.com, which became the central vehicle for the most successful electoral fundraising campaign in American history. By 2012, the social media landscape had changed: Facebook and, to a somewhat lesser extent, Twitter are now well-established as the leading social media platforms in the United States, and were used extensively by the campaign organisations of both candidates. As third-party spaces controlled by independent commercial entities, however, their use necessarily differs from that of home-grown, party-controlled sites: from the point of view of the platform itself, a @BarackObama or @MittRomney is technically no different from any other account, except for the very high follower count and an exceptional volume of @mentions. In spite of the significant social media experience which Democrat and Republican campaign strategists had already accumulated during the 2008 campaign, therefore, the translation of such experience to the use of Facebook and Twitter in their 2012 incarnations still required a substantial amount of new work, experimentation, and evaluation. This chapter examines the Twitter strategies of the leading accounts operated by both campaign headquarters: the ‘personal’ candidate accounts @BarackObama and @MittRomney as well as @JoeBiden and @PaulRyanVP, and the campaign accounts @Obama2012 and @TeamRomney. 
Drawing on datasets which capture all tweets from and at these accounts during the final months of the campaign (from early September 2012 to the immediate aftermath of election night), we reconstruct the campaigns’ approaches to using Twitter for electioneering from the quantitative and qualitative patterns of their activities, and explore the resonance which these accounts have found with the wider Twitter userbase. A particular focus of our investigation in this context will be on the tweeting styles of these accounts: the mixture of original messages, @replies, and retweets, and the level and nature of engagement with everyday Twitter followers. We will examine whether the accounts chose to respond (by @replying) to the messages of support or criticism which were directed at them, whether they retweeted any such messages (and whether there was any preferential retweeting of influential or – alternatively – demonstratively ordinary users), and/or whether they were used mainly to broadcast and disseminate prepared campaign messages. Our analysis will highlight any significant differences between the accounts we examine, trace changes in style over the course of the final campaign months, and correlate such stylistic differences with the respective electoral positioning of the candidates. Further, we examine the use of these accounts during moments of heightened attention (such as the presidential and vice-presidential debates, or in the context of controversies such as that caused by the publication of the Romney “47%” video; additional case studies may emerge over the remainder of the campaign) to explore how they were used to present or defend key talking points, and exploit or avert damage from campaign gaffes.
A complementary analysis of the messages directed at the campaign accounts (in the form of @replies or retweets) will also provide further evidence for the extent to which these talking points were picked up and disseminated by the wider Twitter population. Finally, we also explore the use of external materials (links to articles, images, videos, and other content on the campaign sites themselves, in the mainstream media, or on other platforms) by the campaign accounts, and the resonance which these materials had with the wider follower base of these accounts. This provides an indication of the integration of Twitter into the overall campaigning process, by highlighting how the platform was used as a means of encouraging the viral spread of campaign propaganda (such as advertising materials) or of directing user attention towards favourable media coverage. By building on comprehensive, large datasets of Twitter activity (as of early October, our combined datasets comprise some 3.8 million tweets) which we process and analyse using custom-designed social media analytics tools, and by using our initial quantitative analysis to guide further qualitative evaluation of Twitter activity around these campaign accounts, we are able to provide an in-depth picture of the use of Twitter in political campaigning during the 2012 US election which will provide detailed new insights into social media use in contemporary elections. This analysis will then also be able to serve as a touchstone for the analysis of social media use in subsequent elections, in the USA as well as in other developed nations where Twitter and other social media platforms are utilised in electioneering.
Abstract:
Background Guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly around the world. Most institutions recommend the use of heparin to prevent occlusion; however, there is debate regarding the need for heparin, and evidence to suggest 0.9% sodium chloride (normal saline) may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased cost. Objectives To assess the clinical effects (benefits and harms) of intermittent flushing of heparin versus normal saline to prevent occlusion in long term central venous catheters in infants and children. Search methods The Cochrane Vascular Trials Search Co-ordinator searched the Specialised Register (last searched April 2015) and the Cochrane Register of Studies (Issue 3, 2015). We also searched the reference lists of retrieved trials. Selection criteria We included randomised controlled trials that compared the efficacy of normal saline with heparin to prevent occlusion of long term CVCs in infants and children aged up to 18 years. We excluded temporary CVCs and peripherally inserted central catheters (PICC). Data collection and analysis Two review authors independently assessed trial inclusion criteria and trial quality, and extracted data. Rate ratios were calculated for two outcome measures: occlusion of the CVC and central line-associated blood stream infection. Other outcome measures included duration of catheter placement, inability to withdraw blood from the catheter, use of urokinase or recombinant tissue plasminogen activator, incidence of removal or re-insertion of the catheter, or both, and other CVC-related complications such as dislocation of CVCs, other CVC site infections and thrombosis. Main results Three trials with a total of 245 participants were included in this review.
The three trials directly compared the use of normal saline and heparin; however, the studies used different protocols for the standard and experimental arms, with different heparin concentrations and flushing frequencies reported. In addition, not all studies reported on all outcomes. The quality of the evidence ranged from low to very low because there was no blinding, heterogeneity and inconsistency between studies were high, and the confidence intervals were wide. CVC occlusion was assessed in all three trials (243 participants). We were able to pool the results of two trials for the outcomes of CVC occlusion and CVC-associated blood stream infection. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin group was 0.75 (95% CI 0.10 to 5.51, two studies, 229 participants, very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37, two studies, 231 participants; low quality evidence). The duration of catheter placement was reported to be similar between the two study arms in one study (203 participants). Authors' conclusions The review found that there was not enough evidence to determine the effects of intermittent flushing of heparin versus normal saline to prevent occlusion in long term central venous catheters in infants and children. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
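The headline estimate above is a rate ratio per 1000 catheter days with a log-scale 95% CI. As a rough sketch of how such a figure is derived from arm-level totals (the event counts and catheter-days below are invented for illustration, not data from the included trials):

```python
import math

# Invented arm-level totals (illustrative only, not trial data):
# occlusion events and catheter-days observed in each arm.
saline_events, saline_days = 11, 3600
heparin_events, heparin_days = 13, 3400

saline_rate = saline_events / saline_days * 1000    # occlusions per 1000 catheter-days
heparin_rate = heparin_events / heparin_days * 1000
rate_ratio = saline_rate / heparin_rate

# Approximate 95% CI on the log scale, under a Poisson assumption
# for the event counts in each arm:
se_log_rr = math.sqrt(1 / saline_events + 1 / heparin_events)
ci_low = math.exp(math.log(rate_ratio) - 1.96 * se_log_rr)
ci_high = math.exp(math.log(rate_ratio) + 1.96 * se_log_rr)
```

With event counts this small, the interval spans 1 by a wide margin, which mirrors why the pooled estimates in the review are graded low to very low quality.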
Abstract:
Purpose: To determine the distribution of peripheral refraction, including astigmatism, in 7- and 14-year-old Chinese children. Methods: Cycloplegic central and horizontal peripheral refraction (15° and 30° in the temporal and nasal visual fields) was measured in 2134 7-year-old and 1780 14-year-old children. Results: The 7- and 14-year-old groups included 9 and 594 children, respectively, with moderate and high myopia (≤−3.0 D), 259 and 831 with low myopia (−2.99 to −0.5 D), 1207 and 305 with emmetropia (−0.49 to +1.0 D), and 659 and 50 with hyperopia (>1.0 D). Myopic children had relative peripheral hyperopia while hyperopic and emmetropic children had relative peripheral myopia, with greater changes in relative peripheral refraction occurring in the nasal than the temporal visual field. The older group showed greater relative peripheral hyperopia and higher peripheral J180. Both age groups showed positive slopes of J45 across the visual field, with greater slopes in the older group. Conclusions: Myopic children in mainland China have relative peripheral hyperopia while hyperopic and emmetropic children have relative peripheral myopia. Significant differences exist between 7- and 14-year-old children, with the latter showing more relative peripheral hyperopia, a greater rate of change in J45 across the visual field, and higher peripheral J180.
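The J180 (also written J0) and J45 components reported here are the standard power-vector decomposition of a sphero-cylindrical refraction in the Thibos notation. A minimal sketch of the conversion; the function name and example prescription are ours, not the study's:

```python
import math

def power_vector(sphere, cyl, axis_deg):
    """Convert a sphero-cylindrical refraction (S, C, axis) into
    power-vector components: spherical equivalent M and the
    astigmatic components J0 (a.k.a. J180) and J45."""
    a = math.radians(axis_deg)
    M = sphere + cyl / 2.0                 # spherical equivalent
    J0 = -(cyl / 2.0) * math.cos(2 * a)    # with-/against-the-rule astigmatism
    J45 = -(cyl / 2.0) * math.sin(2 * a)   # oblique astigmatism
    return M, J0, J45

# Example: -2.00 DS / -1.00 DC x 180
M, J0, J45 = power_vector(-2.00, -1.00, 180)
```

Expressing refractions this way lets peripheral astigmatism be averaged and regressed across eyes and field angles, which is why the study can report slopes of J45 across the visual field.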
Abstract:
Background People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can cause damage to the skin. The skin is an important barrier that protects the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections. Objectives To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes including pain and skin damage. Search methods In June 2015 we searched: the Cochrane Wounds Specialised Register; the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting. Selection criteria All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections in all patients in any healthcare setting. Data collection and analysis We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion, and performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate, and otherwise synthesised data descriptively where they were heterogeneous.
Main results We included five RCTs (2277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval between changes (2 to 5 days; control). In each study participants were followed up until the CVAD was removed or until discharge from ICU or hospital.
- Confirmed catheter-related bloodstream infection (CRBSI): One trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low quality evidence).
- Suspected catheter-related bloodstream infection: Two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between people having long or short intervals between dressing changes (RR 0.70, 95% CI 0.23 to 2.10) (low quality evidence).
- All cause mortality: Three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all cause mortality. It is unclear whether there is a difference in the risk of death from any cause between people having long or short intervals between dressing changes (RR 1.06, 95% CI 0.90 to 1.25) (low quality evidence).
- Catheter-site infection: Two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection. It is unclear whether there is a difference in risk of catheter-site infection between people having long or short intervals between dressing changes (RR 1.07, 95% CI 0.71 to 1.63) (low quality evidence).
- Skin damage: One small trial (112 children) and three trials (1475 adults) measured skin damage. There was very low quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR of scoring ≥ 2 on the skin damage scale 0.33, 95% CI 0.16 to 0.68; data for adults not pooled).
- Pain: Two studies involving 193 participants measured pain. It is unclear if there is a difference between long and short interval dressing changes on pain during dressing removal (RR 0.80, 95% CI 0.46 to 1.38) (low quality evidence).
Authors' conclusions The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.
Abstract:
It has been argued that transition points in life, such as the approach towards, and early years of, retirement present key opportunities for interventions to improve the health of the population. Research has also highlighted inequalities, which should be addressed, in the health status of the retired population and in its responses to interventions. We aimed to conduct a systematic review to synthesise international evidence on the types and effectiveness of interventions to increase physical activity among people around the time of retirement. A systematic review of the literature was carried out between February 2014 and April 2015. Searches were not limited by language or location, but were restricted by date to studies published from 1990 onwards. Methods for identifying relevant studies included electronic database searching, reference list checking, and citation searching. The systematic search of the literature identified 104 papers which described their study populations as older adults. However, we found only one paper which specifically referred to its participants as being around the time of retirement. The intervention approaches for older adults encompassed: training of health care professionals; counselling and advice giving; group sessions; individual training sessions; in-home exercise programmes; in-home computer-delivered programmes; in-home telephone support; in-home diet and exercise programmes; and community-wide initiatives. The majority of papers reported some intervention effect, with evidence of positive outcomes for all types of programmes. A wide range of different measures were used to evaluate effectiveness; many were self-reported, and few studies included evaluation of sedentary time.
While the retirement transition is considered a significant point of life change, little research has been conducted to assess whether physical activity interventions at this time may be effective in promoting or maintaining activity, or in reducing health inequalities. We were unable to find any evidence that the transition to retirement was, or was not, a significant point for intervention. Studies in older adults more generally indicated that a range of interventions might be effective for people around retirement age.
Abstract:
Background Around the world, guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly. To prevent occlusion, most institutions recommend the use of heparin when the CVC is not in use. However, there is debate regarding the need for heparin, and evidence to suggest normal saline may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased costs. Objectives To assess the clinical effects (benefits and harms) of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants, children and adolescents. Design A Cochrane systematic review of randomised controlled trials was undertaken. - Data sources: The Cochrane Vascular Group Specialised Register (including MEDLINE, CINAHL, EMBASE and AMED) and the Cochrane Register of Studies were searched. Hand searching of relevant journals and reference lists of retrieved articles was also undertaken. - Review methods: Data were extracted and appraisal undertaken. We included studies that compared the efficacy of normal saline with heparin to prevent occlusion. We excluded temporary CVCs and peripherally inserted central catheters. Rate ratios per 1000 catheter days were calculated for two outcomes: occlusion of the CVC and CVC-associated blood stream infection. Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin; however, the studies used different protocols, with various heparin concentrations and flushing frequencies. The quality of the evidence ranged from low to very low. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin group was 0.75 (95% CI 0.10 to 5.51, two studies, 229 participants, very low quality evidence).
The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37, two studies, 231 participants; low quality evidence). Conclusions It remains unclear whether heparin is necessary for CVC maintenance. More well-designed studies are required to answer this relatively simple, but clinically important, question. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
Abstract:
News Corp’s only substantial competitor in the print journalism sector may be on the brink of giving up the ghost, as has long been speculated even by its natural supporters such as Beecher. If it does, hundreds more jobs will go, along with the many hundreds of experienced, skilled journalists and editors already shown the door by the company.
Abstract:
The Western European house mouse, Mus musculus domesticus, is well known for the high frequency of Robertsonian fusions that have rapidly produced more than 50 karyotypic races, making it an ideal model for studying the mechanisms of chromosomal speciation. The mouse mandible is one of the traits studied most intensively to investigate the effect of Robertsonian fusions on phenotypic variation within and between populations. This complex bone structure has also been widely used to study the level of integration between different morphogenetic units. Here, with the aim of testing the effect of different karyotypes on the morphology of the mouse mandible and on its level of modularity, we performed morphometric analyses of mice from a contact area between two highly metacentric races in Central Italy. We found no difference in size, while mandible shape was found to differ between the two Robertsonian races, even after accounting for the genetic relationships among individuals and geographic proximity. Our results support the existence of two modules that indicate a certain degree of evolutionary independence, but no difference in the strength of modularity between chromosomal races. Moreover, the ascending ramus showed more pronounced interpopulation/race phenotypic differences than the alveolar region, an effect that could be associated with their different polygenic architecture. This study suggests that chromosomal rearrangements play a role in house mouse phenotypic divergence, and that the two modules of the mouse mandible are differentially affected by environmental factors and genetic makeup.
A combination of local inflammation and central memory T cells potentiates immunotherapy in the skin
Abstract:
Adoptive T cell therapy uses the specificity of the adaptive immune system to target cancer and virally infected cells. Yet the mechanisms and means by which to enhance T cell function are incompletely described, especially in the skin. In this study, we use a murine model of immunotherapy to optimize cell-mediated immunity in the skin. We show that in vitro-derived central, but not effector, memory-like T cells bring about rapid regression of skin expressing cognate Ag as a transgene in keratinocytes. Local inflammation induced by the TLR7 agonist imiquimod subtly yet reproducibly decreases time to skin graft rejection elicited by central but not effector memory T cells in an immunodeficient mouse model. Local CCL4, a chemokine liberated by TLR7 agonism, similarly enhances central memory T cell function. In this model, IL-2 facilitates the development in vivo of effector function from central memory but not effector memory T cells. In a model of T cell tolerogenesis, we further show that adoptively transferred central but not effector memory T cells can give rise to successful cutaneous immunity, which is dependent on a local inflammatory cue in the target tissue at the time of adoptive T cell transfer. Thus, adoptive T cell therapy efficacy can be enhanced if CD8+ T cells with a central memory T cell phenotype are transferred, and IL-2 is present with contemporaneous local inflammation. Copyright © 2012 by The American Association of Immunologists, Inc.
Abstract:
Background Diabetic foot complications are the leading cause of lower extremity amputation and diabetes-related hospitalisation in Australia. Studies demonstrate significant reductions in amputations and hospitalisation when health professionals implement best practice management. Whilst other nations have surveyed health professionals on specific diabetic foot management, to the best of the authors’ knowledge this appears not to have occurred in Australia. The primary aim of this study was to examine Australian podiatrists’ diabetic foot management compared with best practice recommendations by the Australian National Health and Medical Research Council. Methods A 36-item Australian Diabetic Foot Management survey, employing seven-point Likert scales (1 = Never; 7 = Always) to measure multiple aspects of best practice diabetic foot management, was developed. The survey was briefly tested for face and content validity. The survey was electronically distributed to Australian podiatrists via professional associations. Demographics including sex, years treating patients with diabetes, employment sector and patient numbers were also collected. Chi-squared and Mann Whitney U tests were used to test differences between sub-groups. Results Three hundred and eleven podiatrists responded; 222 (71%) were female, 158 (51%) were from the public sector, and the median experience was 11–15 years. Participants reported treating a median of 21–30 diabetes patients each week, including 1–5 with foot ulcers. Overall, participants registered median scores of at least “very often” (>6) in their use of most items covering best practice diabetic foot management. Notable exceptions were: “never” (1 (1 – 3)) using total contact casting, “sometimes” (4 (2 – 5)) performing an ankle brachial index, “sometimes” (4 (1 – 6)) using the University of Texas Wound Classification System, and “sometimes” (4 (3 – 6)) referring to specialist multi-disciplinary foot teams.
Public sector podiatrists reported higher use of, or access to, all those items compared with private sector podiatrists (p < 0.01). Conclusions This study provides the first baseline information on Australian podiatrists’ adherence to best practice diabetic foot guidelines. It appears podiatrists manage large caseloads of people with diabetes and are generally implementing best practice guideline recommendations, with some notable exceptions. Further studies are required to identify barriers to implementing these recommendations to ensure all Australians with diabetes have access to best practice care to prevent amputations.
Abstract:
Background Best practice clinical health care is widely recognised to be founded on evidence-based practice. Enhancing evidence-based practice via the rapid translation of new evidence into everyday clinical practice is fundamental to the success of health care and, in turn, the health care professions. Little is known about the collective research capacity and culture of the podiatry profession across Australia. Thus, the aim of this study was to investigate the research capacity and culture of the podiatry profession within Australia and determine whether there were any differences between podiatrists working in different health sectors and workplaces. Methods All registered podiatrists were eligible to participate in a cross-sectional online survey. The Australian Podiatry Associations disseminated the survey and all podiatrists were encouraged to distribute it to colleagues. The Research Capacity and Culture (RCC) tool was used to collect all research capacity and culture item variables using a 10-point scale (1 = lowest; 10 = highest). Additional demographic, workplace and health sector data variables were also collected. Mann–Whitney U, Kruskal–Wallis and logistic regression analyses were used to determine any differences between health sectors and workplaces. Word cloud analysis was used for qualitative responses on individual motivators of, and barriers to, research culture. Results There were 232 fully completed surveys (6% of Australian registered podiatrists). Overall, respondents reported low success or skills (median rating < 4) on the majority of individual success or skill items. Podiatrists working in multi-practitioner workplaces reported higher individual success or skills on the majority of items compared with sole practitioners (p < 0.05).
Non-clinical and public health sector podiatrists reported significantly higher post-graduate study enrolment or completion, research activity participation, provisions to undertake research and individual success or skill than those working privately. Conclusions This study suggests that podiatrists in Australia report similarly low levels of research success or skill to those reported in other allied health professions. The workplace setting and health sector seem to play key roles in self-reported research success and skills. This is important knowledge for podiatrists and researchers aiming to translate research evidence into clinical practice.
Abstract:
Background Foot complications have been found to affect large proportions of hospital inpatients with diabetes. However, no studies have investigated the proportion of foot complications affecting people in general inpatient populations. The aims of this cross-sectional study were to investigate the point-prevalence of different foot complications in general inpatient populations, analyse differences between diabetes and non-diabetes sub-groups, and examine the characteristics of people primarily admitted for a foot complication. Methods Eligible participants were all adults admitted overnight, for any reason, into five diverse hospitals on one day, excluding maternity, mental health and cognitively impaired patients. All participants underwent a physical foot examination, by trained podiatrists using validated measures, to clinically diagnose different foot complications, including foot wounds, infections, deformity, peripheral arterial disease (PAD) and peripheral neuropathy (PN). Data were also collected on participants' primary reason for admission and a range of demographic, social determinant, medical history, foot complication history, self-care and footwear risk factors. Results Overall, 733 participants consented (83% of eligible participants); mean (±SD) age 62 (±19) years, 480 (55.8%) were male and 172 (23.5%) had diabetes. Foot complication prevalence included: wounds 9.0% (95% CI 5.1-8.7), infections 3.3% (2.2-4.9), deformity 22.4% (19.5-26.7), PAD 21.0% (18.2-24.1) and PN 22.0% (19.1-25.1). Diabetes populations had significantly more foot complications than non-diabetes populations (p < 0.01): wounds (15.7% vs 7.0%), infections (7.1% vs 2.2%), deformity (30.5% vs 19.9%), PAD (35.1% vs 16.7%) and PN (43.3% vs 15.4%). Foot complications were the primary reason for admission in 7.4% (95% CI 5.7-9.5) of all participants.
In a backwards stepwise multivariate analysis, having a foot complication as the primary reason for admission was independently associated (OR (95% CI)) with foot wounds (18.9 (7.3-48.7)), foot infections (6.0 (1.6-22.4)), a history of amputation (4.7 (1.3-17.0)) and PAD (2.9 (1.3-6.6)). Conclusions The findings of this study indicate that one in every ten hospital inpatients had an active foot wound or infection. Inpatients with diabetes had significantly higher proportions of foot complications than non-diabetes inpatients. Remarkably, one in every thirteen inpatients in this study was primarily hospitalised for a foot complication. Further research and policy are required to tackle this seemingly large inpatient foot complication burden.
Multi-GNSS precise point positioning with raw single-frequency and dual-frequency measurement models
Abstract:
The emergence of multiple satellite navigation systems, including BDS, Galileo, modernized GPS, and GLONASS, brings great opportunities and challenges for precise point positioning (PPP). We study the contributions of various GNSS combinations to PPP performance based on undifferenced or raw observations, in which the signal delays and ionospheric delays must be considered. A priori ionospheric knowledge, such as regional or global corrections, strengthens the estimation of ionospheric delay parameters. The undifferenced models are generally more suitable for single-, dual-, or multi-frequency data processing for single or combined GNSS constellations. Another advantage over ionospheric-free PPP models is that undifferenced models avoid noise amplification by linear combinations. Extensive performance evaluations are conducted with multi-GNSS data sets collected from 105 MGEX stations in July 2014. Dual-frequency PPP results from each single constellation show that the convergence time of the undifferenced PPP solution is usually shorter than that of ionospheric-free PPP solutions, while the positioning accuracy of undifferenced PPP shows more improvement for the GLONASS system. In addition, the GLONASS undifferenced PPP results demonstrate performance advantages in high latitude areas, while this impact is less obvious in the GPS/GLONASS combined configuration. The results have also indicated that the BDS GEO satellites have negative impacts on the undifferenced PPP performance given the current “poor” orbit and clock knowledge of GEO satellites. More generally, the multi-GNSS undifferenced PPP results have shown improvements in the convergence time by more than 60% in both the single- and dual-frequency PPP results, while the positioning accuracy after convergence indicates no significant improvements for the dual-frequency PPP solutions, but an improvement of about 25% on average for the single-frequency PPP solutions.
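The noise-amplification point can be made concrete with the classic dual-frequency ionospheric-free combination, which undifferenced (raw) processing avoids. The sketch below uses the standard GPS L1/L2 carrier frequencies; it is an illustration of the general effect, not of the paper's specific processing chain:

```python
import math

# Standard GPS L1/L2 carrier frequencies (MHz)
f1, f2 = 1575.42, 1227.60

# Ionospheric-free pseudorange combination: P_IF = a1*P1 + a2*P2
a1 = f1**2 / (f1**2 - f2**2)
a2 = -f2**2 / (f1**2 - f2**2)

# The coefficients sum to 1, so the geometric range is preserved,
# but assuming equal, uncorrelated noise on P1 and P2 the combined
# noise is inflated by the root-sum-square of the coefficients:
noise_factor = math.hypot(a1, a2)
```

With a noise inflation factor near 3, the appeal of estimating ionospheric delays directly in an undifferenced model, aided by a priori regional or global corrections, rather than eliminating them by combination, is clear.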
Resumo:
This article examines some of the ways in which Australia’s First Peoples have responded to serious community health concerns about alcohol through the medium of popular music. The writing, performing and recording of popular songs about alcohol provide an important example of community-led responses to health issues, and the effectiveness of music in communicating stories and messages about alcohol has been recognised through various government-funded recording projects. This article describes some of these issues in remote Australian Aboriginal communities, exploring a number of complexities that arise through arts-based ‘instrumentalist’ approaches to social and health issues. It draws on the author’s own experience and collaborative work with Aboriginal musicians in Tennant Creek, a remote town in Australia’s Northern Territory.
Resumo:
Background There is a strong link between antibiotic consumption and the rate of antibiotic resistance. In Australia, the vast majority of antibiotics are prescribed by general practitioners, and the most common indication is acute respiratory infections. The aim of this study is to assess whether implementing a package of integrated, multifaceted interventions reduces antibiotic prescribing for acute respiratory infections in general practice. Methods/design This is a cluster randomised trial comparing two parallel groups of general practitioners in 28 urban general practices in Queensland, Australia: 14 intervention and 14 control practices. The protocol was peer-reviewed by content experts nominated by the funding organization. This study evaluates an integrated, multifaceted evidence-based package of interventions implemented over a six-month period. The included interventions, which have previously been demonstrated to be effective at reducing antibiotic prescribing for acute respiratory infections, are: delayed prescribing; patient decision aids; communication training; commitment to a practice prescribing policy for antibiotics; a patient information leaflet; and near-patient testing with C-reactive protein. In addition, two sub-studies are nested in the main study: (1) point prevalence estimation of the carriage of bacterial upper respiratory pathogens in practice staff and asymptomatic patients; (2) feasibility of direct measures of antibiotic resistance by nose/throat swabbing. The main outcome data are from Australia’s national health insurance scheme, Medicare, which will be accessed after the completion of the intervention phase. They include the number of antibiotic prescriptions and the number of patient visits per general practitioner for periods before and during the intervention. The incidence of antibiotic prescriptions will be modelled using the numbers of patients as the denominator and seasonal and other factors as explanatory variables.
Results will compare the change in prescription rates before and during the intervention between the two groups of practices. Semi-structured interviews will be conducted with the general practitioners and practice staff (practice nurse and/or practice manager) from the intervention practices at the conclusion of the intervention phase to assess the feasibility and uptake of the interventions. An economic evaluation will be conducted to estimate the costs of implementing the package, and its cost-effectiveness in terms of cost per unit reduction in prescribing. Discussion The results on the effectiveness, cost-effectiveness, acceptability and feasibility of this package of interventions will inform policy decisions on any national implementation.
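The comparison of prescription-rate changes between intervention and control practices can be sketched as a difference-in-differences on rates per patient visit. The numbers below are hypothetical, purely for illustration; the trial itself plans a regression model with seasonal and other covariates, not this simplified calculation.

```python
# Much-simplified sketch (hypothetical counts, not trial data) of the
# planned comparison: antibiotic prescriptions per 100 patient visits,
# before vs during the intervention, in the two practice groups.

def rate(prescriptions, visits):
    """Antibiotic prescriptions per 100 patient visits."""
    return 100.0 * prescriptions / visits

# group -> period -> (prescriptions, patient visits)
counts = {
    "intervention": {"before": (420, 2000), "during": (300, 2000)},
    "control":      {"before": (410, 2000), "during": (395, 2000)},
}

change = {}
for group, periods in counts.items():
    before = rate(*periods["before"])
    during = rate(*periods["during"])
    change[group] = during - before  # change in rate, per 100 visits

# Difference-in-differences: the change in the intervention group beyond
# the secular/seasonal change observed in the control group.
did = change["intervention"] - change["control"]
print(round(did, 2))  # -5.25
```

Using visits as the denominator, as the protocol specifies, guards against an apparent drop in prescribing that merely reflects fewer consultations.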