Abstract:
Using the dimensional reduction regularization scheme, we show that radiative corrections to the anomaly of the axial current, which is coupled to the gauge field, are absent in a supersymmetric U(1) gauge model for both 't Hooft-Veltman and Bardeen prescriptions for γ5. We also discuss the results with reference to conventional dimensional regularization. This result has significant implications with respect to the renormalizability of supersymmetric models.
Abstract:
Social media analytics is a rapidly developing field of research at present: new, powerful ‘big data’ research methods draw on the Application Programming Interfaces (APIs) of social media platforms. Twitter has proven to be a particularly productive space for such methods development, initially due to the explicit support and encouragement of Twitter, Inc. However, because of the growing commercialisation of Twitter data, and the increasing API restrictions imposed by Twitter, Inc., researchers are now facing a considerably less welcoming environment, and are forced to find additional funding for paid data access, or to bend or break the rules of the Twitter API. This article considers the increasingly precarious nature of ‘big data’ Twitter research, and flags the potential consequences of this shift for academic scholarship.
Abstract:
The 2008 US election has been heralded as the first presidential election of the social media era, but took place at a time when social media were still in a state of comparative infancy; so much so that the most important platform was not Facebook or Twitter, but the purpose-built campaign site my.barackobama.com, which became the central vehicle for the most successful electoral fundraising campaign in American history. By 2012, the social media landscape had changed: Facebook and, to a somewhat lesser extent, Twitter are now well-established as the leading social media platforms in the United States, and were used extensively by the campaign organisations of both candidates. As third-party spaces controlled by independent commercial entities, however, their use necessarily differs from that of home-grown, party-controlled sites: from the point of view of the platform itself, a @BarackObama or @MittRomney is technically no different from any other account, except for the very high follower count and an exceptional volume of @mentions. In spite of the significant social media experience which Democrat and Republican campaign strategists had already accumulated during the 2008 campaign, therefore, the translation of such experience to the use of Facebook and Twitter in their 2012 incarnations still required a substantial amount of new work, experimentation, and evaluation. This chapter examines the Twitter strategies of the leading accounts operated by both campaign headquarters: the ‘personal’ candidate accounts @BarackObama and @MittRomney as well as @JoeBiden and @PaulRyanVP, and the campaign accounts @Obama2012 and @TeamRomney. 
Drawing on datasets which capture all tweets from and at these accounts during the final months of the campaign (from early September 2012 to the immediate aftermath of the election night), we reconstruct the campaigns’ approaches to using Twitter for electioneering from the quantitative and qualitative patterns of their activities, and explore the resonance which these accounts have found with the wider Twitter userbase. A particular focus of our investigation in this context will be on the tweeting styles of these accounts: the mixture of original messages, @replies, and retweets, and the level and nature of engagement with everyday Twitter followers. We will examine whether the accounts chose to respond (by @replying) to the messages of support or criticism which were directed at them, whether they retweeted any such messages (and whether there was any preferential retweeting of influential or – alternatively – demonstratively ordinary users), and/or whether they were used mainly to broadcast and disseminate prepared campaign messages. Our analysis will highlight any significant differences between the accounts we examine, trace changes in style over the course of the final campaign months, and correlate such stylistic differences with the respective electoral positioning of the candidates. Further, we examine the use of these accounts during moments of heightened attention (such as the presidential and vice-presidential debates, or in the context of controversies such as that caused by the publication of the Romney “47%” video; additional case studies may emerge over the remainder of the campaign) to explore how they were used to present or defend key talking points, and exploit or avert damage from campaign gaffes. 
A complementary analysis of the messages directed at the campaign accounts (in the form of @replies or retweets) will also provide further evidence for the extent to which these talking points were picked up and disseminated by the wider Twitter population. Finally, we also explore the use of external materials (links to articles, images, videos, and other content on the campaign sites themselves, in the mainstream media, or on other platforms) by the campaign accounts, and the resonance which these materials had with the wider follower base of these accounts. This provides an indication of the integration of Twitter into the overall campaigning process, by highlighting how the platform was used as a means of encouraging the viral spread of campaign propaganda (such as advertising materials) or of directing user attention towards favourable media coverage. By building on comprehensive, large datasets of Twitter activity (as of early October, our combined datasets comprise some 3.8 million tweets) which we process and analyse using custom-designed social media analytics tools, and by using our initial quantitative analysis to guide further qualitative evaluation of Twitter activity around these campaign accounts, we are able to provide an in-depth picture of the use of Twitter in political campaigning during the 2012 US election, yielding detailed new insights into social media use in contemporary elections. This analysis will then also be able to serve as a touchstone for the analysis of social media use in subsequent elections, in the USA as well as in other developed nations where Twitter and other social media platforms are utilised in electioneering.
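The tweeting-style breakdown described above (original messages, @replies, and retweets) can be sketched with a simple heuristic classifier. The classification rule, the data layout, and the sample tweets below are illustrative assumptions, not the chapter's actual analytics tooling.

```python
# Heuristic tweet-style classifier: a tweet beginning with "RT @" is
# treated as a (manual-style) retweet, one beginning with "@" as a
# reply, and anything else as an original broadcast message.
from collections import Counter

def classify_tweet(text: str) -> str:
    """Classify raw tweet text as 'retweet', 'reply', or 'original'."""
    if text.startswith("RT @"):
        return "retweet"
    if text.startswith("@"):
        return "reply"
    return "original"

def style_profile(tweets):
    """Return the per-style share of an account's tweets."""
    counts = Counter(classify_tweet(t) for t in tweets)
    total = sum(counts.values())
    return {style: counts[style] / total for style in counts}

# Hypothetical sample of tweets from one campaign account:
sample = [
    "Four more years.",
    "@voter Thanks for your support!",
    "RT @Obama2012: Early voting starts today.",
    "Read the plan: http://example.org/plan",
]
profile = style_profile(sample)
```

Comparing such profiles across accounts and over time is one way to quantify whether an account mainly broadcasts prepared messages or engages with everyday followers.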
Abstract:
First year medical laboratory science students (up to 120) undertake a group e-poster project, based in a blended learning model. Google Drive, encompassing Google’s cloud computing software, provides a readily accessible, transparent online space for students to collaborate with each other and realise tangible outcomes from their learning. The Cube provides an inspiring digital learning display space for student ‘conference style’ presentations.
Abstract:
The feasibility of different modern analytical techniques for the mass spectrometric detection of anabolic androgenic steroids (AAS) in human urine was examined in order to enhance current analytical practice and to find reasonable strategies for effective sports drug testing. A comparative study of the sensitivity and specificity of gas chromatography (GC) combined with low (LRMS) and high resolution mass spectrometry (HRMS) in screening of AAS was carried out with four metabolites of methandienone. Measurements were done in selected ion monitoring mode with HRMS using a mass resolution of 5000. With HRMS the detection limits were considerably lower than with LRMS, enabling detection of steroids at levels as low as 0.2–0.5 ng/ml. Even with HRMS, however, the biological background hampered the detection of some steroids. The applicability of liquid-phase microextraction (LPME) was studied with metabolites of fluoxymesterone, 4-chlorodehydromethyltestosterone, stanozolol and danazol. Factors affecting the extraction process were studied, and a novel LPME method with in-fiber silylation was developed and validated for GC/MS analysis of the danazol metabolite. The method allowed precise, selective and sensitive analysis of the metabolite and enabled simultaneous filtration, extraction, enrichment and derivatization of the analyte from urine without any other sample preparation steps. Liquid chromatographic/tandem mass spectrometric (LC/MS/MS) methods utilizing electrospray ionization (ESI), atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) were developed and applied for detection of oxandrolone and metabolites of stanozolol and 4-chlorodehydromethyltestosterone in urine. All methods exhibited high sensitivity and specificity.
ESI showed the best applicability, however, and an LC/ESI-MS/MS method for routine screening of nine 17-alkyl-substituted AAS was thus developed, enabling fast and precise measurement of all analytes with detection limits below 2 ng/ml. The potential of chemometrics to resolve complex GC/MS data was demonstrated with samples prepared for AAS screening. Acquired full scan spectral data (m/z 40-700) were processed by the OSCAR algorithm (Optimization by Stepwise Constraints of Alternating Regression). The deconvolution process was able to extract from a GC/MS run more than double the number of components visible as chromatographic peaks. Severely overlapping components, as well as components hidden in the chromatographic background, could be isolated successfully. All the studied techniques proved to be useful analytical tools for improving detection of AAS in urine. The relative superiority of the procedures is, however, compound-dependent, and the different techniques complement each other.
Abstract:
The specific objective of this paper is to develop a state space model of a tubular ammonia reactor which is the heart of an ammonia plant in a fertiliser complex. A ninth order model with three control inputs and two disturbance inputs is generated from the nonlinear distributed model using linearization and lumping approximations. The lumped model is chosen such that the steady state temperature at the exit of the catalyst bed computed from the simplified state space model is close enough to the one computed from the nonlinear steady state model. The model developed in this paper is very useful for the design of continuous/discrete versions of single variable/multivariable control algorithms.
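A linear state-space model of the kind described above, with nine states, three control inputs and two disturbance inputs, has the generic form x' = Ax + Bu + Ed. The sketch below simulates such a system with placeholder matrices; the actual reactor matrices are not given in the abstract, so A, B and E here are purely illustrative.

```python
import numpy as np

# Generic linear state-space simulation, x' = A x + B u + E d, with the
# dimensions quoted in the abstract: 9 states, 3 control inputs, and
# 2 disturbance inputs. A, B, E are placeholders, NOT the identified
# tubular-ammonia-reactor matrices.
n_x, n_u, n_d = 9, 3, 2
rng = np.random.default_rng(0)
A = -np.eye(n_x) + 0.01 * rng.standard_normal((n_x, n_x))  # stable placeholder
B = rng.standard_normal((n_x, n_u))
E = rng.standard_normal((n_x, n_d))

def simulate(x0, u, d, dt=0.01, steps=1000):
    """Forward-Euler integration of x' = A x + B u + E d with constant inputs."""
    x = x0.copy()
    for _ in range(steps):
        x = x + dt * (A @ x + B @ u + E @ d)
    return x

# With zero control and disturbance inputs the stable placeholder
# system decays toward the origin.
x_final = simulate(np.ones(n_x), np.zeros(n_u), np.zeros(n_d))
```

A discrete version of the same model (x[k+1] = (I + dt·A)x[k] + …) is what single-variable or multivariable digital control algorithms of the kind mentioned in the abstract would be designed against.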
Abstract:
The Palghat–Cauvery suture zone in southern India separates Archaean crustal blocks to the north and the Proterozoic Madurai block to the south. Here we present the first detailed study of a partially retrogressed eclogite (from within the Sittampundi anorthositic complex in the suture zone) that occurs as a 20-cm wide layer in a garnet gabbro layer in anorthosite. The eclogite largely consists of an assemblage of coexisting porphyroblasts of almandine–pyrope garnet and augitic clinopyroxene. However, a few garnets contain inclusions of omphacite. Rims and symplectites composed of Na–Ca amphibole and plagioclase form a retrograde assemblage. Petrographic analysis and calculated phase equilibria indicate that garnet–omphacite–rutile–melt was the peak metamorphic assemblage and that it formed at ca. 20 kbar and above 1000 °C. The eclogite was exhumed on a very tight hairpin-type, anticlockwise P–T path, which we relate to subduction and exhumation in the Palghat–Cauvery suture zone. The REE composition of the minerals suggests a basaltic oceanic crustal protolith metamorphosed in a subduction regime. Geological–structural relations combined with geophysical data from the Palghat–Cauvery suture zone suggest that the eclogite facies metamorphism was related to formation of the suture zone. Closure of the Mozambique Ocean led to development of the suture zone and to its western extension in the Betsimisaraka suture of Madagascar.
Abstract:
Control systems arising in many engineering fields are often of distributed parameter type, modeled by partial differential equations. Decades of research have led to a great deal of literature on distributed parameter systems scattered across a wide spectrum. Extensions of popular finite-dimensional techniques to infinite-dimensional systems, as well as innovative infinite-dimensional specific control design approaches, have been proposed. A comprehensive account of all the developments would probably require several volumes and is perhaps a very difficult task. In this paper, however, an attempt has been made to give a brief yet reasonably representative account of many of these developments in chronological order. To make it accessible to a wide audience, mathematical descriptions have been completely avoided, with the assumption that an interested reader can always find the mathematical details in the relevant references.
Abstract:
This report provides an analysis of the cultural, policy and legal implications of ‘mash-ups’. The study offers a short history of mash-ups, explaining how the current ‘remix culture’ builds upon a range of creative antecedents and cultural traditions which valorised appropriation, quotation, and transformation. It provides modern examples of mash-ups, such as sound recordings, musical works, film and artistic works, focusing on works seen on YouTube and other online applications. In particular, it considers:
* Literary mash-ups of canonical texts, including Pride and Prejudice and Zombies, The Wind Done Gone, After the Rain, and 60 Years Later;
* Artistic mash-ups, highlighting the Obama Hope poster, the ‘Column’ case, and the competition for extending famous album covers;
* Geographical mash-ups, most notably the Google Australia bushfires map;
* Musical mash-ups, such as The Grey Album and the work of Girl Talk; and
* Cinematic mash-ups, including remixes of There Will Be Blood and The Downfall.
This survey provides an analysis of why mash-up culture is valuable, highlighting the range of aesthetic, political, comic, and commercial impulses behind the creation and the dissemination of mash-ups. First, this report highlights the tensions between copyright law and mash-ups in particular cultural sectors. Second, this report emphasizes the importance of civil society institutions in promoting and defending mash-ups in both copyright litigation and policy debates. It provides a study of key organisations, including:
* The Fair Use Project;
* The Organization for Transformative Works;
* Public Knowledge;
* The Electronic Frontier Foundation; and
* The Chilling Effects Clearinghouse.
This report suggests that much can be learnt from this network of organisations in the United States. There is a dearth of comparable legal clinics, advocacy groups, and creative institutions in Australia.
As a result, the public interest values of copyright law have only received weak, incidental support from defendant companies – such as Network Ten and IceTV – with other copyright agendas. Third, this report canvasses a succinct model for legislative reform in respect of copyright law and mash-ups. It highlights:
* The extent to which mash-ups are ‘tolerated uses’;
* The conflicting judicial precedents on substantiality in Australia and the United States;
* The debate over copyright exceptions relating to mash-ups and remixes;
* The use of the take-down and notice system under the safe harbours regime by copyright owners in respect of mash-ups;
* The impact of technological protection measures on mash-ups and remixes;
* The possibility of statutory licensing in respect of mash-ups;
* The use of Creative Commons licences;
* The impact of moral rights protection upon mash-ups;
* The interaction between economic and moral rights under copyright law; and
* Questions of copyright law, freedom of expression, and political mash-ups.
Abstract:
Experimental investigations are carried out in the IISc hypersonic shock tunnel on the film cooling effectiveness of a single jet (diameter 2 mm and 0.9 mm) and a forward-facing array of micro-jets (diameter 300 μm each) of the same effective area (corresponding to the respective single jet). The single jet and the corresponding micro-jets are injected from the stagnation zone of a blunt cone model (58° apex angle and nose radius of 35 mm). Nitrogen and helium are injected as coolant gases. Experiments are performed at a freestream Mach number of 5.9, at 0 degrees angle of attack, with a stagnation enthalpy of 1.84 MJ/kg, with and without injection. The ratios of the jet stagnation pressure to the freestream pitot pressure used in the present study are 1.2 and 1.45. Up to 50% reduction in surface heat transfer rate was observed with the array of micro-jets compared to that of the respective single jet with nitrogen as the coolant, while the corresponding reduction was up to 37% for helium injection, with the schlieren flow visualizations showing no major change in the shock standoff distance, and thus no major changes in other aerodynamic aspects such as drag.
Abstract:
The problem of an infinite transversely isotropic circular cylindrical shell subjected to an axisymmetric radial external line load is investigated using elasticity theory, classical shell theory and shear deformation theory. The results obtained by these methods are compared for two ratios of inner to outer shell radius and for varying degrees of anisotropy. Some typical results are given here to show the effect of anisotropy and the thickness of the shell on the distribution of stresses and displacements.
Abstract:
The compounds Pb2PtO4 and PbPt2O4 were synthesized from an intimate mixture of yellow PbO and Pt metal powders by heating under pure oxygen gas at 973 K for periods up to 600 ks with intermediate grinding and recompacting. Both compounds were found to decompose on heating in pure oxygen to PbO and Pt, apparently in conflict with the requirements for equilibrium phase relations in the ternary system Pb–Pt–O. The oxygen chemical potentials corresponding to the three-phase mixtures Pb2PtO4 + PbO + Pt and PbPt2O4 + PbO + Pt were measured as a function of temperature using solid-state electrochemical cells incorporating yttria-stabilized zirconia as the solid electrolyte and pure oxygen gas at 0.1 MPa pressure as the reference electrode. The standard Gibbs free energies of formation of the ternary oxides were derived from the measurements. Analysis of the results indicated that the equilibrium involving the three condensed phases Pb2PtO4 + PbO + Pt is metastable. Under equilibrium conditions, Pb2PtO4 should have decomposed to a mixture of PbO and PbPt2O4. Measurement of the oxygen potential corresponding to this equilibrium decomposition as a function of temperature indicated that the decomposition temperature in pure oxygen is 1014(±2) K. This was further confirmed by direct determination of phase relations in the ternary Pb–Pt–O system by equilibrating several compositions at 1023 K for periods up to 850 ks and phase identification of quenched samples using X-ray diffraction (XRD), scanning electron microscopy (SEM) and energy dispersive X-ray spectroscopy (EDS). Only one ternary oxide, PbPt2O4, was stable at 1023 K under equilibrium conditions. Alloys and intermetallic compounds along the Pb–Pt binary were in equilibrium with PbO.
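For a YSZ-based oxygen concentration cell of the kind described, the oxygen chemical potential at the working electrode follows from the measured cell EMF via the textbook relation Δμ(O2) = −4FE, since transferring one O2 molecule across the electrolyte carries four electrons. The sketch below applies this relation with an illustrative EMF value, not the paper's data.

```python
# Oxygen chemical potential from the EMF of a solid-state electrochemical
# cell with a yttria-stabilized zirconia (YSZ) electrolyte and a pure-O2
# (0.1 MPa) reference electrode: Delta mu(O2) = -4 F E, with 4 electrons
# transferred per O2 molecule.
F = 96485.332  # Faraday constant, C/mol

def oxygen_potential(emf_volts: float) -> float:
    """Oxygen chemical potential (J/mol O2) relative to the pure-O2
    reference electrode, from the measured cell EMF."""
    return -4.0 * F * emf_volts

# Illustrative EMF value (not a measured datum from the study):
dmu = oxygen_potential(0.250)  # J/mol O2
```

Fitting such Δμ(O2) values against temperature for each three-phase mixture is what yields the standard Gibbs free energies of formation reported in the abstract.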
Abstract:
Nickel zinc hydroxysalt–Pt metal nanoparticle composite was prepared by intercalation of the anionic platinum complex, [PtCl6]2− in nickel zinc hydroxysalt through ion exchange reaction and subsequent reduction of the platinum complex by ethanol. Powder X-ray diffraction and microscopy studies indicate that the process of reduction of the platinum complex in the interlayer region of the anionic clay takes place topotactically without destroying the layers.
Abstract:
Background: The past decade has seen a rapid change in the climate system, with an increased risk of extreme weather events. On and following the 3rd of January 2013, Tasmania experienced three catastrophic bushfires, which led to the evacuation of several communities, the loss of many properties, and a financial cost of approximately AUD$80 million. Objective: To explore the impacts of the 2012/2013 Tasmanian bushfires on community pharmacies. Method: Qualitative research methods were undertaken, employing semi-structured telephone interviews with a purposive sample of seven Tasmanian pharmacists. The interviews were recorded and transcribed, and two different methods were used to analyse the text. The first method utilised Leximancer® text analytics software to provide a bird’s-eye view of the conceptual structure of the text. The second method involved manual, open and axial coding, conducted independently by the two researchers for inter-rater reliability, to identify key themes in the discourse. Results: Two main themes were identified - ‘people’ and ‘supply’ - from which six key concepts were derived. The six concepts were ‘patients’, ‘pharmacists’, ‘local doctor’, ‘pharmacy operations’, ‘disaster management planning’, and ‘emergency supply regulation’. Conclusion: This study identified challenges faced by community pharmacists during the Tasmanian bushfires. Interviewees highlighted the need for both the Tasmanian State Government and the Australian Federal Government to recognise the important primary care role that community pharmacists play during natural disasters, and therefore to involve pharmacists in disaster management planning. They called for greater support and guidance for community pharmacists from regulatory and other government bodies during these events.
Their comments highlighted the need for a review of Tasmania’s 3-day emergency supply regulation that allows pharmacists to provide a three-day supply of a patient’s medication without a doctor’s prescription in an emergency situation.
Abstract:
The neutron–antineutron transition amplitude caused by an effective six-fermion interaction with strength λeff is calculated within the context of the MIT Bag Model. The transition mass δm is found to have the value δm = λeff × 3 × 10⁻⁴ GeV⁶.
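As a dimensional check (our annotation, not part of the abstract): the six-fermion ΔB = 2 operator has mass dimension 9, so its coefficient λeff carries dimension GeV⁻⁵, and the bag-model matrix element supplies the GeV⁶ factor, leaving δm with the dimension of a mass:

```latex
\delta m \;=\; \lambda_{\mathrm{eff}}\,
  \langle \bar n \,\lvert\, \mathcal{O}_{6f} \,\rvert\, n \rangle
  \;\approx\; \lambda_{\mathrm{eff}} \times 3\times 10^{-4}\ \mathrm{GeV}^{6},
\qquad
[\lambda_{\mathrm{eff}}] = \mathrm{GeV}^{-5}
\;\Longrightarrow\; [\delta m] = \mathrm{GeV}.
```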