129 results for “Trace form”


Relevance: 20.00%

Publisher:

Abstract:

The 510-million-year-old Kalkarindji Large Igneous Province correlates in time with the first major extinction event after the Cambrian explosion of life, and large igneous provinces correlate with all major mass extinction events of the last 500 million years. The genetic link between large igneous provinces and mass extinctions nevertheless remains unclear. My work is a contribution towards understanding the magmatic processes involved in the generation of large igneous provinces. I concentrate on the origin of chromium (Cr) variation in magmas and have developed a model in which high-temperature melts intrude into, and assimilate large amounts of, upper continental crust.
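The assimilation model described above can be illustrated with a simple two-component mass balance for Cr (a hypothetical sketch; all concentrations are placeholders, not values from the study):

```python
# Two-component mixing: Cr concentration of a hybrid magma that has
# assimilated a mass fraction x of upper continental crust.
# All concentrations are illustrative placeholders (ppm), not data from the study.

def mixed_concentration(c_melt, c_crust, x):
    """Cr in the hybrid magma after assimilating a mass fraction x of crust."""
    return (1 - x) * c_melt + x * c_crust

c_cr_melt = 1000.0   # Cr in a primitive high-temperature melt (ppm, assumed)
c_cr_crust = 90.0    # Cr in average upper continental crust (ppm, assumed)

for x in (0.0, 0.1, 0.3, 0.5):
    cr = mixed_concentration(c_cr_melt, c_cr_crust, x)
    print(f"{x:.0%} crust assimilated -> Cr = {cr:.0f} ppm")
```

Because Cr is far more abundant in primitive melts than in upper crust, even modest assimilation dilutes Cr strongly, which is one way such a model can produce large Cr variation.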


A facile and sensitive surface-enhanced Raman scattering (SERS) substrate was prepared by controlled potentiostatic deposition of a closely packed single layer of gold nanostructures (AuNS) over a flat gold (pAu) platform. The nanometre-scale inter-particle distance resulted in a high population of ‘hot spots’, which enormously enhanced the scattered Raman photons. A renewed methodology was followed to precisely quantify the SERS substrate enhancement factor (SSEF), which was estimated to be (2.2 ± 0.17) × 10⁵. The reproducibility of the SERS signal acquired with the developed substrate was tested by establishing the relative standard deviation (RSD) of 150 repeated measurements from various locations on the substrate surface; a low RSD of 4.37 confirmed the homogeneity of the developed substrate. The sensitivity of pAu/AuNS was proven by readily determining 2,4,6-trinitrotoluene (TNT) at 100 fM. As a proof of concept of the potential of the new pAu/AuNS substrate for field analysis, TNT in soil and water matrices was selectively detected after forming a Meisenheimer complex with cysteamine.
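The two figures of merit above, the RSD-based homogeneity check and the enhancement factor, can be sketched as follows (simulated intensities and assumed inputs only; the study's actual measurements and SSEF methodology are not reproduced here):

```python
import random
import statistics

# Homogeneity check: relative standard deviation (RSD, %) of repeated SERS
# intensities from different spots on the substrate. The intensities are
# simulated placeholders, not the study's measurements.
random.seed(1)
intensities = [random.gauss(1000, 44) for _ in range(150)]  # 150 repeat spots (a.u.)
rsd = 100 * statistics.stdev(intensities) / statistics.mean(intensities)
print(f"RSD = {rsd:.2f}%")

# Enhancement factor in the commonly used form
# SSEF = (I_SERS / N_SERS) / (I_bulk / N_bulk); all inputs are assumed values.
def ssef(i_sers, n_sers, i_bulk, n_bulk):
    return (i_sers / n_sers) / (i_bulk / n_bulk)

print(f"SSEF = {ssef(200.0, 1e6, 100.0, 1e11):.1e}")  # assumed intensities and molecule counts
```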


This research was developed between Australia and Papua New Guinea (PNG) over two years, investigating ways in which theatre for development could be held accountable for the claims it makes, especially in PNG. The motivation to improve theatre for development (TfD) practice was triggered by the desire to enhance the democratic processes of collaboration and co-creativity often lacking in TfD activity in Papua New Guinea. Through creative practice as research and reflective processes, working with established and experienced local community theatre practitioners, a new form of theatre for development, Theatre in Conversations, evolved. This form integrated three related genres of TfD: process drama, community theatre and community conversations. The suitability and impact of Theatre in Conversations was tested in three remote villages in PNG. Findings and outputs from the study have the potential to be used by theatre for development practitioners in other countries.


Email is rapidly replacing other channels as the preferred means of communication between contracting parties. The recent decision in Stellard Pty Ltd v North Queensland Fuel Pty Ltd [2015] QSC 119 reinforces the judicial acceptance of email as an effective means of creating a binding agreement, and the willingness of courts to adopt a liberal concept of ‘signing’ in an electronic environment.


A miniaturized flow-through system, consisting of a gold-coated silicon substrate and based on surface-enhanced Raman spectroscopy, has been used to study the detection of vapour from model explosive compounds. The measurements show that the detectability of the vapour molecules at room temperature depends sensitively on the interaction between the molecule and the substrate. The results highlight the capability of a flow system combined with Raman spectroscopy for detecting low-vapour-pressure compounds, with a limit of detection of 0.2 ppb demonstrated by the detection of bis(2-ethylhexyl) phthalate, a common polymer additive, emitted from commercial polyvinyl chloride (PVC) tubing at room temperature.
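A limit of detection such as the 0.2 ppb quoted above is commonly estimated from blank variability and the calibration slope; a hypothetical sketch (the abstract does not state how the LOD was actually derived, and all numbers below are invented):

```python
import statistics

# Common 3-sigma estimate of a limit of detection (LOD):
# LOD = 3 * sd(blank) / calibration slope.
# The blank intensities and slope are illustrative, not the paper's data.
def limit_of_detection(blank_signals, slope):
    """3-sigma LOD from replicate blank signals and a calibration slope."""
    return 3 * statistics.stdev(blank_signals) / slope

blanks = [10.2, 9.8, 10.1, 10.0, 9.9]   # blank Raman intensities (a.u., assumed)
slope = 2.4                             # intensity per ppb of analyte (assumed)
print(f"LOD ≈ {limit_of_detection(blanks, slope):.2f} ppb")
```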


Membrane proteins play important roles in many biochemical processes and are also attractive targets of drug discovery for various diseases. The elucidation of membrane protein types provides clues for understanding the structure and function of proteins. We recently developed a novel system for predicting protein subnuclear localizations. In this paper, we propose a simplified version of that system for predicting membrane protein types directly from primary protein structures, which incorporates amino acid classifications and physicochemical properties into a general form of pseudo-amino acid composition. In this simplified system, we design a two-stage multi-class support vector machine combined with a two-step optimal feature-selection process, which proves very effective in our experiments. The performance of the method is evaluated on two benchmark datasets consisting of five types of membrane proteins. The overall prediction accuracies for the five types are 93.25% and 96.61% via the jackknife test and the independent dataset test, respectively. These results indicate that our method is effective and valuable for predicting membrane protein types. A web server for the proposed method is available at http://www.juemengt.com/jcc/memty_page.php
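The first ingredient of a pseudo-amino-acid-composition feature vector is the classical amino acid composition, the 20 residue frequencies of the primary sequence; a minimal sketch (the full method also appends sequence-order correlation terms and the paper's physicochemical groupings, which are omitted here):

```python
# Classical amino-acid-composition features: the frequency of each of the
# 20 standard residues in a protein's primary sequence. This is only the
# first block of a pseudo-amino-acid-composition vector.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aa_composition(sequence):
    """Return the 20 amino-acid frequencies of a protein sequence."""
    sequence = sequence.upper()
    n = len(sequence)
    return [sequence.count(aa) / n for aa in AMINO_ACIDS]

features = aa_composition("MKTAYIAKQR")   # toy sequence, not from the datasets
print(len(features), round(sum(features), 6))
```

These fixed-length vectors are what a downstream classifier, such as the two-stage SVM described above, consumes.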


A novel, highly selective resonance light scattering (RLS) method was developed for the analysis of phenol in different types of industrial water. An important aspect of the method involved the use of graphene quantum dots (GQDs), which were obtained from the pyrolysis of citric acid dissolved in aqueous solution. In the presence of horseradish peroxidase (HRP) and H₂O₂, the GQDs were found to react quantitatively with phenol, such that the RLS spectral band (310 nm) was quantitatively enhanced as a consequence of the interaction between the GQDs and the quinone formed in the above reaction. The novel analytical method was demonstrated to have better selectivity and sensitivity for the determination of phenol in water than other analytical methods found in the literature. Thus, trace amounts of phenol were detected over the linear ranges of 6.00×10⁻⁸–2.16×10⁻⁶ M and 2.40×10⁻⁶–2.88×10⁻⁵ M, with a detection limit of 2.20×10⁻⁸ M. In addition, three spiked waste water samples and two untreated lake water samples were analysed for phenol, and satisfactory results were obtained with the novel, sensitive and rapid RLS method.
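Quantification within a linear range like those above amounts to fitting a calibration line and inverting it for an unknown; a sketch with fabricated calibration points (not the paper's data):

```python
# Linear calibration sketch: fit RLS intensity vs phenol concentration by
# ordinary least squares, then invert the line for an unknown sample.
# All calibration points are fabricated for illustration.
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

conc = [1e-7, 5e-7, 1e-6, 2e-6]        # phenol standards (M, assumed)
signal = [12.0, 60.0, 120.0, 240.0]    # enhanced RLS intensity (a.u., assumed)
slope, intercept = fit_line(conc, signal)

unknown_signal = 90.0                  # measured sample intensity (assumed)
print(f"phenol ≈ {(unknown_signal - intercept) / slope:.2e} M")
```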


Background: Foot disease complications, such as foot ulcers and infection, contribute to considerable morbidity and mortality. These complications are typically precipitated by “high-risk factors” such as peripheral neuropathy and peripheral arterial disease, which are more prevalent in specific “at-risk” populations, such as people with diabetes, kidney disease and cardiovascular disease. To the best of the authors’ knowledge, a tool capturing multiple high-risk factors and foot disease complications across multiple at-risk populations has yet to be tested. This study aimed to develop and test the validity and reliability of a Queensland High Risk Foot Form (QHRFF) tool.

Methods: The study was conducted in two phases. Phase one developed the QHRFF using an existing diabetes foot disease tool, literature searches, stakeholder groups and an expert panel. Phase two tested the QHRFF for validity and reliability. Four clinicians, representing different levels of expertise, were recruited to test validity and reliability. Three cohorts of patients were recruited: one tested criterion measure reliability (n = 32), another tested criterion validity and inter-rater reliability (n = 43), and another tested intra-rater reliability (n = 19). Validity was determined using sensitivity, specificity and positive predictive values (PPV); reliability was determined using Kappa, weighted Kappa and intra-class correlation (ICC) statistics.

Results: A QHRFF tool containing 46 items across seven domains was developed. Criterion measure reliability of at least moderate agreement (Kappa > 0.4; ICC > 0.75) was seen in 91% (29 of 32) of tested items. Criterion validity of at least moderate categories (PPV > 0.7) was seen in 83% (60 of 72) of tested items. Inter- and intra-rater reliability of at least moderate agreement (Kappa > 0.4; ICC > 0.75) was seen in 88% (84 of 96) and 87% (20 of 23) of tested items, respectively.

Conclusions: The QHRFF had acceptable validity and reliability across the majority of items, particularly items identifying relevant co-morbidities, high-risk factors and foot disease complications. Recommendations have been made to improve or remove the identified weaker items for future QHRFF versions. Overall, the QHRFF possesses suitable practicality, validity and reliability to assess and capture relevant foot disease items across multiple at-risk populations.
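The inter-rater agreement statistic used above can be computed with Cohen's kappa; a minimal sketch with invented ratings (the study's data are not reproduced here):

```python
from collections import Counter

# Cohen's kappa for two raters scoring the same items: chance-corrected
# agreement, with Kappa > 0.4 conventionally read as at least moderate.
# The ratings below are invented for illustration.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # rater A: high-risk factor present?
b = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]   # rater B, same 10 patients
print(f"kappa = {cohens_kappa(a, b):.2f}")
```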


An efficient method for the analysis of hydroquinone at trace levels in water samples has been developed in the form of a fluorescent probe based on graphene quantum dots (GQDs). The analytical variable, fluorescence quenching, was generated by the formation of benzoquinone intermediates during the catalytic oxidation of hydroquinone by horseradish peroxidase (HRP). In general, the reaction mechanism involved the benzoquinone intermediate, as an electron acceptor, affecting the surface state of the GQDs via an electron-transfer effect. The water-soluble GQDs were directly prepared by the pyrolysis of citric acid, and with the use of this hybrid enzyme system the detection limit for hydroquinone was as low as 8.4 × 10⁻⁸ M. Furthermore, the analysis was almost unaffected by other phenolic and quinone compounds, such as phenol, resorcinol and other quinones, and therefore the developed GQD method produced satisfactory results for the analysis of hydroquinone in several different lake water samples.
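Quenching-based calibrations of this kind are often linearised with the Stern-Volmer relation F₀/F = 1 + Ksv·[Q]; a hypothetical sketch (the abstract does not state the paper's actual calibration model, and all values are invented):

```python
# Hypothetical Stern-Volmer treatment of GQD fluorescence quenching:
# (F0/F - 1) = Ksv * [Q], fitted through the origin by least squares.
# The quenching constant and intensities are invented for illustration.
def stern_volmer_ksv(f0, f_values, concentrations):
    """Least-squares Ksv for (F0/F - 1) = Ksv * [Q], forced through the origin."""
    ys = [f0 / f - 1 for f in f_values]
    return sum(q * y for q, y in zip(concentrations, ys)) / \
           sum(q * q for q in concentrations)

f0 = 1000.0                              # unquenched GQD intensity (a.u., assumed)
q = [1e-7, 5e-7, 1e-6]                   # hydroquinone concentrations (M, assumed)
f = [f0 / (1 + 2e6 * c) for c in q]      # synthetic data with Ksv = 2e6 per M
print(f"Ksv ≈ {stern_volmer_ksv(f0, f, q):.3g} M^-1")
```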


This is a critical review of the scope of the literacy curriculum in the twenty-first century, uncovering the strengths, controversies, and silences that have divided literacy researchers and educators. It conceptualizes the literacy curriculum as a particular set of socially organized symbolic practices that are always selective, and which are inextricably connected to the function and organization of schooling. We trace the political, historical, and ideological antecedents of literacy curriculum and schooling as a form of cultural apparatus of the nation-state, before tracing some of the major interpretive paradigms that have influenced the shape of the literacy curriculum in many parts of the world. These include debates about skills-based approaches, whole language, systemic functional grammar, and critical literacy. It then draws attention to noteworthy advances and shifts in the field over recent decades: debates about the role of orality in the literacy curriculum, home-school community literacy practices, teacher and student knowledge of language and grammar, and the role of curriculum area literacies. It anticipates the future of the literacy curriculum in online textual environments and the changing sensorial and material nature of literacy practices, while acknowledging that curriculum innovation is always limited in complex ways by historically established pedagogic discourses of schooling.


Digital media have contributed to significant disruptions in the business of audience measurement. Television broadcasters have long relied on simple and authoritative measures of who is watching what. The demand for ratings data, as a common currency in transactions involving advertising and program content, will likely remain, but accompanying measurements of audience engagement with media content would also be of value. Today's media environment increasingly includes social media and second-screen use, providing a data trail that affords an opportunity to measure engagement. If the limitations of using social media to indicate audience engagement can be overcome, social media use may allow for quantitative and qualitative measures of engagement. Raw social media data must be contextualized, and it is suggested that tools used by sports analysts be incorporated to do so. Inspired by baseball's sabermetrics, the authors propose Telemetrics in an attempt to separate actual performance from contextual factors. Telemetrics facilitates measuring audience activity in a manner that controls for factors such as time slot, network, and so forth. It potentially allows both descriptive and predictive measures of engagement.
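The context adjustment proposed above can be sketched as expressing observed activity relative to a baseline for the program's time slot and network (a minimal sketch with invented figures; the authors' actual Telemetrics formulae are not given in the abstract):

```python
# Context-adjusted engagement sketch: rather than a raw tweet count, report
# the ratio of observed activity to the baseline expected for that network
# and time slot. All baselines and counts are invented placeholders.
baselines = {  # mean tweets per airing for (network, time slot), assumed
    ("NetA", "primetime"): 50_000,
    ("NetB", "daytime"): 4_000,
}

def adjusted_engagement(tweets, network, slot):
    """Ratio of observed activity to the contextual baseline (1.0 = typical)."""
    return tweets / baselines[(network, slot)]

print(adjusted_engagement(75_000, "NetA", "primetime"))  # 1.5x its slot baseline
```

This mirrors how adjusted baseball statistics normalise a player's raw numbers by park and era before comparing performances.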


The Codex Alimentarius Commission of the Food and Agriculture Organization of the United Nations (FAO) and the World Health Organization (WHO) develops food standards, guidelines and related texts for protecting consumer health and ensuring fair trade practices globally. The major part of the world's population lives in more than 160 countries that are members of the Codex Alimentarius. The Codex Standard on Infant Formula was adopted in 1981 based on scientific knowledge available in the 1970s and is currently being revised. As part of this process, the Codex Committee on Nutrition and Foods for Special Dietary Uses asked the ESPGHAN Committee on Nutrition to initiate a consultation process with the international scientific community to provide a proposal on nutrient levels in infant formulae, based on scientific analysis and taking into account existing scientific reports on the subject. ESPGHAN accepted the request and, in collaboration with its sister societies in the Federation of International Societies on Pediatric Gastroenterology, Hepatology and Nutrition, invited highly qualified experts in the area of infant nutrition to form an International Expert Group (IEG) to review the issues raised. The group arrived at recommendations on the compositional requirements for a global infant formula standard which are reported here.


The 2008 US election has been heralded as the first presidential election of the social media era, but took place at a time when social media were still in a state of comparative infancy; so much so that the most important platform was not Facebook or Twitter, but the purpose-built campaign site my.barackobama.com, which became the central vehicle for the most successful electoral fundraising campaign in American history. By 2012, the social media landscape had changed: Facebook and, to a somewhat lesser extent, Twitter are now well-established as the leading social media platforms in the United States, and were used extensively by the campaign organisations of both candidates. As third-party spaces controlled by independent commercial entities, however, their use necessarily differs from that of home-grown, party-controlled sites: from the point of view of the platform itself, a @BarackObama or @MittRomney is technically no different from any other account, except for the very high follower count and an exceptional volume of @mentions. In spite of the significant social media experience which Democrat and Republican campaign strategists had already accumulated during the 2008 campaign, therefore, the translation of such experience to the use of Facebook and Twitter in their 2012 incarnations still required a substantial amount of new work, experimentation, and evaluation. This chapter examines the Twitter strategies of the leading accounts operated by both campaign headquarters: the ‘personal’ candidate accounts @BarackObama and @MittRomney as well as @JoeBiden and @PaulRyanVP, and the campaign accounts @Obama2012 and @TeamRomney. 
Drawing on datasets which capture all tweets from and at these accounts during the final months of the campaign (from early September 2012 to the immediate aftermath of the election night), we reconstruct the campaigns’ approaches to using Twitter for electioneering from the quantitative and qualitative patterns of their activities, and explore the resonance which these accounts have found with the wider Twitter userbase. A particular focus of our investigation in this context will be on the tweeting styles of these accounts: the mixture of original messages, @replies, and retweets, and the level and nature of engagement with everyday Twitter followers. We will examine whether the accounts chose to respond (by @replying) to the messages of support or criticism which were directed at them, whether they retweeted any such messages (and whether there was any preferential retweeting of influential or – alternatively – demonstratively ordinary users), and/or whether they were used mainly to broadcast and disseminate prepared campaign messages. Our analysis will highlight any significant differences between the accounts we examine, trace changes in style over the course of the final campaign months, and correlate such stylistic differences with the respective electoral positioning of the candidates. Further, we examine the use of these accounts during moments of heightened attention (such as the presidential and vice-presidential debates, or in the context of controversies such as that caused by the publication of the Romney “47%” video; additional case studies may emerge over the remainder of the campaign) to explore how they were used to present or defend key talking points, and exploit or avert damage from campaign gaffes. 
A complementary analysis of the messages directed at the campaign accounts (in the form of @replies or retweets) will also provide further evidence for the extent to which these talking points were picked up and disseminated by the wider Twitter population. Finally, we also explore the use of external materials (links to articles, images, videos, and other content on the campaign sites themselves, in the mainstream media, or on other platforms) by the campaign accounts, and the resonance which these materials had with the wider follower base of these accounts. This provides an indication of the integration of Twitter into the overall campaigning process, by highlighting how the platform was used as a means of encouraging the viral spread of campaign propaganda (such as advertising materials) or of directing user attention towards favourable media coverage. By building on comprehensive, large datasets of Twitter activity (as of early October, our combined datasets comprise some 3.8 million tweets) which we process and analyse using custom-designed social media analytics tools, and by using our initial quantitative analysis to guide further qualitative evaluation of Twitter activity around these campaign accounts, we are able to provide an in-depth picture of the use of Twitter in political campaigning during the 2012 US election which will provide detailed new insights into social media use in contemporary elections. This analysis will then also be able to serve as a touchstone for the analysis of social media use in subsequent elections, in the USA as well as in other developed nations where Twitter and other social media platforms are utilised in electioneering.
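The tweeting-style breakdown described above, the mixture of original messages, @replies, and retweets, can be sketched with a simple text-based classifier (the tweets below are invented examples, not the chapter's dataset, which would come from the Twitter API):

```python
from collections import Counter

# Classify each tweet as a retweet, an @reply, or an original broadcast
# message, then report the mixture. Tweets are invented for illustration.
def tweet_style(text):
    if text.startswith("RT @"):
        return "retweet"
    if text.startswith("@"):
        return "reply"
    return "original"

tweets = [
    "Four more years.",                        # hypothetical broadcast message
    "@supporter Thanks for volunteering!",     # hypothetical @reply
    "RT @TeamRomney: Join us in Ohio today.",  # hypothetical retweet
    "Early voting starts today in Iowa.",      # hypothetical broadcast message
]

mix = Counter(tweet_style(t) for t in tweets)
print({style: count / len(tweets) for style, count in mix.items()})
```

An account whose mixture is dominated by originals is broadcasting prepared campaign messages, whereas a high share of @replies indicates engagement with everyday followers.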


Recovering the motion of a non-rigid body from a set of monocular images permits the analysis of dynamic scenes in uncontrolled environments. However, the extension of factorisation algorithms for rigid structure from motion to the low-rank non-rigid case has proved challenging. This stems from the comparatively hard problem of finding a linear “corrective transform” which recovers the projection and structure matrices from an ambiguous factorisation. We show that this greater difficulty is due to the need to find multiple solutions to a non-trivial problem, and cast a number of previous approaches as alleviating this issue by either a) introducing constraints on the basis, making the problems non-identical, or b) incorporating heuristics to encourage a diverse set of solutions, making the problems inter-dependent. While it has previously been recognised that finding a single solution to this problem is sufficient to estimate cameras, we show that it is possible to bootstrap this partial solution to find the complete transform in closed form. However, we acknowledge that our method minimises an algebraic error and is thus inherently sensitive to deviation from the low-rank model. We compare our closed-form solution for non-rigid structure with known cameras to the closed-form solution of Dai et al. [1], which we find to produce only coplanar reconstructions. We therefore recommend that 3D reconstruction error always be measured relative to a trivial reconstruction such as a planar one.
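The factorisation ambiguity at the heart of the corrective-transform problem can be illustrated numerically: an SVD factorisation of the measurement matrix is only defined up to an invertible transform G (a generic low-rank sketch with random placeholder matrices, not the authors' algorithm or real tracked feature data):

```python
import numpy as np

# Any rank-r factorisation W = M @ S is ambiguous: for invertible G,
# (M @ G) @ (inv(G) @ S) reproduces W equally well. Recovering the
# metrically correct G is the "corrective transform" problem discussed above.
rng = np.random.default_rng(0)
r = 6                                        # e.g. rank 3K for K = 2 basis shapes
W = rng.standard_normal((20, r)) @ rng.standard_normal((r, 30))  # placeholder data

U, s, Vt = np.linalg.svd(W, full_matrices=False)
M_hat = U[:, :r] * np.sqrt(s[:r])            # ambiguous "motion" factor
S_hat = np.sqrt(s[:r])[:, None] * Vt[:r]     # ambiguous "structure" factor

G = np.eye(r) + 0.1 * rng.standard_normal((r, r))  # some invertible transform
W_alt = (M_hat @ G) @ (np.linalg.inv(G) @ S_hat)

print(np.allclose(W, M_hat @ S_hat), np.allclose(W, W_alt))
```

Both factorisations reproduce W exactly, which is why extra constraints (rotation orthonormality, basis constraints) are needed to pin down G.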