194 results for Popularity.
Abstract:
Integration of biometrics is considered an attractive solution to the problems associated with password-based human authentication, as well as to the secure storage and release of cryptographic keys, which is one of the critical issues in modern cryptography. However, the widespread popularity of bio-cryptographic solutions is somewhat restricted by the fuzziness associated with biometric measurements. Error control mechanisms must therefore be adopted to ensure that the fuzziness of biometric inputs can be sufficiently countered. In this paper, we outline the existing error control techniques used in bio-cryptography and explain how they are deployed in different types of solutions. Finally, we elaborate on the important factors to consider when choosing an appropriate error correction mechanism for a particular biometric-based solution.
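The abstract above does not include a worked example, but the core idea, an error-correcting code absorbing the difference between two noisy readings of the same biometric, can be sketched with a toy fuzzy-commitment-style scheme. The repetition code, the 8-bit key and the hard-coded "template" below are illustrative assumptions, not details from the paper; practical systems use far stronger codes such as BCH or Reed–Solomon.

```python
# Toy fuzzy-commitment sketch: a 3x repetition code absorbs a few flipped
# bits between enrolment and verification. Illustrative only.
import secrets

def encode(bits, r=3):
    return [b for b in bits for _ in range(r)]          # repeat each bit r times

def decode(bits, r=3):
    return [int(sum(bits[i:i + r]) > r // 2)            # majority vote per group
            for i in range(0, len(bits), r)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

key = [secrets.randbelow(2) for _ in range(8)]          # secret key to protect
enrol = [1, 0, 1, 1, 0, 0, 1, 0] * 3                    # 24-bit "biometric" template
helper = xor(encode(key), enrol)                        # public helper data

query = enrol[:]                                        # noisy re-measurement:
query[5] ^= 1                                           # a couple of bits flip
query[17] ^= 1

recovered = decode(xor(helper, query))                  # decoding corrects the noise
print(recovered == key)                                 # True: key released despite fuzziness
```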
Abstract:
Objective To synthesise recent research on the use of machine learning approaches to mining textual injury surveillance data. Design Systematic review. Data sources The electronic databases searched included PubMed, Cinahl, Medline, Google Scholar, and Proquest. The bibliographies of all relevant articles were examined and associated articles were identified using a snowballing technique. Selection criteria For inclusion, articles were required to meet the following criteria: (a) used a health-related database, (b) focused on injury-related cases, and (c) used machine learning approaches to analyse textual data. Methods The papers identified through the search were screened, resulting in 16 papers selected for review. Articles were reviewed to describe the databases and methodology used, the strengths and limitations of different techniques, and the quality assurance approaches used. Due to heterogeneity between studies, meta-analysis was not performed. Results Occupational injuries were the focus of half of the machine learning studies, and the most common methods described were Bayesian probability or Bayesian network based methods, used either to predict injury categories or to extract common injury scenarios. Models were evaluated through comparison with gold standard data, content expert evaluation, or statistical measures of quality. Machine learning was found to provide high precision and accuracy when predicting a small number of categories, and was valuable for visualisation of injury patterns and prediction of future outcomes. However, difficulties related to generalisability, source data quality, complexity of models and integration of content and technical knowledge were discussed. Conclusions The use of narrative text for injury surveillance has grown in popularity, complexity and quality over recent years. With advances in data mining techniques, increased capacity for analysis of large databases, the involvement of computer scientists in the injury prevention field, and more comprehensive use and description of quality assurance methods in text mining approaches, it is likely that we will see continued growth and advancement in knowledge of text mining in the injury field.
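As a hedged illustration of the Bayesian category prediction described in the review (not code from any of the reviewed studies), the sketch below trains a naive Bayes classifier on a handful of invented injury narratives and predicts the category of a new narrative.

```python
# Minimal naive Bayes text classifier over invented injury narratives,
# illustrating the kind of Bayesian category prediction the review describes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

narratives = [
    "worker fell from ladder while painting ceiling",
    "slipped on wet floor in kitchen and fell",
    "hand caught in conveyor belt during maintenance",
    "finger crushed by press machine on production line",
]
categories = ["fall", "fall", "machinery", "machinery"]   # gold-standard labels

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(narratives, categories)

print(model.predict(["labourer fell off scaffolding at construction site"]))  # -> ['fall']
```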
Abstract:
Social media is now an integral part of modern sports broadcasting, which combines old and new media into a redefined and multidimensional experience for fans. The popularity of social media has particular implications for professional women's sports due to this convergence, and may be utilised by organisations to address some of the issues women's sports face from a lack of traditional broadcast coverage. This article discusses Twitter activity surrounding the ANZ Championship netball competition and analyses the ways social media can help transcend the structural challenges that “old” media has placed on professional women's sports.
Abstract:
Structural Health Monitoring (SHM) schemes are useful for proper management of the performance of structures and for preventing their catastrophic failure. Vibration-based SHM schemes have gained popularity during the past two decades, resulting in significant research. It is hence inevitable that future SHM schemes will include robust and automated vibration-based damage assessment techniques (VBDAT) to detect, localize and quantify damage. In this context, the Damage Index (DI) method, which is classified as a non-model or output-based VBDAT, has the ability to automate the damage assessment process using actual measurements without a computer or numerical model. Although damage assessment using DI methods has achieved reasonable success for structures made of homogeneous materials such as steel, the same level of success has not been reported for Reinforced Concrete (RC) structures. The complexity of flexural cracks is claimed to be the main reason hindering the applicability of existing DI methods in RC structures. Past research also indicates that use of a constant baseline throughout the damage assessment process undermines the potential of the Modal Strain Energy based Damage Index (MSEDI). To address this situation, this paper presents a novel method developed as part of a comprehensive research project carried out at Queensland University of Technology, Brisbane, Australia. This novel process, referred to as the baseline updating method, continuously updates the baseline and systematically tracks both crack formation and propagation, with the ability to automate the damage assessment process using output-only data. The proposed method is illustrated through examples, and the results demonstrate the capability of the method to achieve the desired outcomes.
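To make the modal-strain-energy idea concrete, the sketch below implements a generic Stubbs-type damage index on a simulated beam mode shape and applies a naive baseline update once damage is flagged. The element discretisation, detection threshold and update rule are assumptions for illustration only; they are not the specific MSEDI formulation or baseline-updating algorithm developed in the paper.

```python
# Simplified modal strain energy damage index (single mode, Stubbs-type form)
# with a naive baseline update. Illustrative assumptions throughout.
import numpy as np

def curvature_energy(mode_shape, dx=1.0):
    """Per-element strain energy proxy: squared second derivative of the mode shape."""
    curv = np.gradient(np.gradient(mode_shape, dx), dx)
    return curv ** 2

def damage_index(baseline_mode, current_mode):
    u = curvature_energy(baseline_mode)              # undamaged (baseline) energy
    d = curvature_energy(current_mode)               # current (possibly damaged) energy
    num = (d + d.sum()) * u.sum()
    den = (u + u.sum()) * d.sum()
    return num / den                                 # DI noticeably > 1 suggests local stiffness change

x = np.linspace(0, np.pi, 51)
baseline = np.sin(x)                                 # first bending mode of a simple beam
damaged = np.sin(x).copy()
damaged[24:27] *= 1.05                               # local perturbation mimicking a crack

di = damage_index(baseline, damaged)
flagged = np.where(di > 1.02)[0]                     # assumed detection threshold
print("elements flagged:", flagged)

if flagged.size:                                     # baseline updating: adopt the current
    baseline = damaged.copy()                        # state as the new reference
```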
Abstract:
This article uses sports coverage as a lens to analyse changes in broadcast television (free-to-air [FTA] and subscription) in Australia from the 1950s to the present. Sport has always been a vital genre for broadcast television. It is now, arguably, more important than ever. It is indisputable – though rarely comprehensively documented – that sport and sports coverage have shaped and transformed Australian television over many years. The significance of sport has incrementally increased with successive technological and industrial developments – such as the introduction of colour in 1975, electronic news gathering from 1976, subscription television in 1995, digital terrestrial broadcasting in 2001 and digital subscription broadcasting in 2004 – to the point where broadcast television’s continuing popularity and ongoing cultural significance rely to a great extent on sports coverage and related programming. In 2015, the launch of a bevy of new subscription video on demand (SVOD) services in Australia might appear to have reinforced drama as the key genre in the battle for attention and engagement, but for both historical and contemporary reasons sport remains the crucial form of audiovisual content.
Abstract:
Meat/meat alternatives (M/MA) are key sources of Fe, Zn and protein, but intake tends to be low in young children. Australian recommendations state that Fe-rich foods, including M/MA, should be the first complementary foods offered to infants. The present paper reports M/MA consumption of Australian infants and toddlers, compares intake with guidelines, and suggests strategies to enhance adherence to those guidelines. Mother–infant dyads recruited as part of the NOURISH and South Australian Infants Dietary Intake studies provided 3 d of intake data at three time points: Time 1 (T1) (n 482, mean age 5·5 (SD 1·1) months), Time 2 (T2) (n 600, mean age 14·0 (SD 1·2) months) and Time 3 (T3) (n 533, mean age 24 (SD 0·7) months). Of 170 infants consuming solids and aged over 6 months at T1, 50 (29 %) consumed beef, lamb, veal (BLV) or pork on at least one of 3 d. Commercial infant foods containing BLV or poultry were the most common form of M/MA consumed at T1, whilst by T2 BLV mixed dishes (including pasta bolognaise) had become more popular and remained so at T3. Processed M/MA increased in popularity over time, led by pork (including ham). The present study shows that M/MA are not being eaten by Australian infants or toddlers regularly enough, or in adequate quantities, to meet recommendations, and that the form in which these foods are eaten can lead to smaller M/MA serve sizes and greater Na intake. Parents should be encouraged to offer M/MA in a recognisable form, as one of the first complementary foods, in order to increase acceptance at a later age.
New paradigm, new educational requirements? Australian viewpoints on education for digital libraries
Abstract:
The rise in popularity of the digital library has led to studies addressing digital library education and curriculum development, most of which emanate from the United States and Europe. However, to date very little research has been conducted with an Australian focus. Additionally, very few studies worldwide have sought the opinions of practitioners or considered the influence that these opinions may have on developing appropriate digital library curricula. The current paper is drawn from a larger study which sought to determine the skills and knowledge required of library and information professionals to work in a digital library environment. Data were collected via an online questionnaire from two target groups: practitioners working in academic libraries and Library and Information Science (LIS) educators across Australia. This paper examines in depth the findings from the survey relating to the following topics: firstly, whether or not there is a need for an educational programme targeted solely at the digital library environment; and secondly, the preferred delivery options for such a programme and the preferred models of digital library education. In addition, the elements which should be included in the curricula of a digital library education programme are discussed. Findings are compared and discussed with reference to the literature which informed the study. Finally, implications for the sustainability of library education programmes in Australia are identified and directions for further research highlighted.
Abstract:
In an essay, "The Books of Last Things", Delia Falconer discusses the emergence of a new genre in publishing - microhistories. She cites a number of recent titles in non-fiction and fiction - Longitude, Cod, Tulips, Pushkin's Button, Nathaniel's Nutmeg, Zarafa, The Surgeon of Crowthorne, The Potato, The Perfect Storm. Delia Falconer observes of this tradition: "One has the sense, reading these books, of a surprising weight, of pleasant shock. In part, it is because we are looking at things which are generally present around us, but modestly out of sight and mind - historical nitty gritty like cod, potatoes, longitudinal clocks - which the authors have thrust suddenly, like a Biblical visitation of frogs or locusts, in our face. Things like spice and buttons and clocks are generally seen to enable history on the large scale, but are not often viewed as its worthy subjects. And by the same grand logic of history, more unusual phenomena like cabinets of curiosities or glass-making or farm lore or sailors' knots are simply odd blips on its radar screen, interesting footnotes. These new books, microhistories, reverse the usual order of history, which argues from the general to the particular, in order to prove its inevitable progress. They start from the footnotes. But by reversing the process, and walking through the back door of history, you don't necessarily end up at the front of the same house." Delia Falconer speculates about the reasons for the popularity of microhistories. She concludes: "I would like to think that reading them is not simply an exercise in nostalgia, but a challenge to the present". In Mauve, Simon Garfield provides a new way of thinking and writing about the history of intellectual property. Instead of providing a grand historical narrative of intellectual property, he tells the story of a particular invention, and its exploitation. Simon Garfield relates how English chemist William Perkin accidentally discovered a way to mass-produce the colour mauve in a factory. Working on a treatment for malaria in his London home laboratory, Perkin failed to produce artificial quinine. Instead he created a dark oily sludge that turned silk a beautiful light purple. The colour was unique and became the most desirable shade in the fashion houses of Paris and London. ... The book Mauve will have a number of contemporary resonances for intellectual property lawyers and academics. Simon Garfield emphasizes the difficulties inherent in commercialising an invention and managing intellectual property. He investigates the uneasy collaboration between industry and science. Simon Garfield suggests that complaints about the efficacy of patent offices are perennial. He also highlights the problems faced by courts and law-makers in accommodating new technologies within the logic of patent law. In his elegant microhistory of the colour mauve, Simon Garfield confirms the conclusion of Brad Sherman and Lionel Bently that many aspects of modern intellectual property law can only be understood through an understanding of the past: "The image of intellectual property law that developed during the 19th century and the narrative of identity which this engendered played and continue to play an important role in the way we think about and understand intellectual property law".
Abstract:
Over the years bioelectrical impedance analysis (BIA) has gained popularity in the assessment of body composition. However, equations for the prediction of whole body composition use whole body BIA. This study attempts to evaluate the usefulness of segmental BIA in the assessment of whole body composition. A cross-sectional descriptive study was conducted at the Professorial Paediatric Unit of Lady Ridgeway Hospital, Colombo, involving 259 (M/F: 144/115) healthy children aged 5 to 15 years. Height, weight, and total and segmental bioelectrical impedance were measured, and impedance indices and specific resistivity for the whole body and its segments were calculated. Segmental BIA indices showed a significant association with whole body composition measures assessed by total body water (TBW) using the isotope dilution method (D2O). The impedance index was better related to TBW and fat free mass (FFM), while specific resistivity was better related to the fat mass of the body. Regression equations with different combinations of variables showed high predictability of whole body composition. The results of this study show that segmental BIA can be used as an alternative approach to predict whole body composition in Sri Lankan children.
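The abstract refers to impedance indices and regression-based prediction of total body water (TBW); the sketch below shows the conventional impedance index, height²/impedance, fed into a simple linear regression. The data and fitted coefficients are invented placeholders, not the prediction equations derived in the study.

```python
# Generic sketch: impedance index (height^2 / impedance) as a predictor of
# total body water (TBW). Values below are invented, not study data.
import numpy as np

height_cm = np.array([112.0, 120.0, 131.0, 140.0, 152.0])     # hypothetical children
impedance_ohm = np.array([760.0, 720.0, 680.0, 640.0, 600.0])
tbw_litres = np.array([11.5, 13.2, 15.8, 18.1, 21.4])         # e.g. from D2O dilution

impedance_index = height_cm ** 2 / impedance_ohm              # Ht^2 / Z

slope, intercept = np.polyfit(impedance_index, tbw_litres, 1) # TBW = a * Ht^2/Z + b
predicted_tbw = slope * impedance_index + intercept
print(np.round(predicted_tbw, 1))
```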
Abstract:
Despite longstanding concern with the dimensionality of the service quality construct as measured by ServQual and IS-ServQual instruments, variations on the IS-ServQual instrument have been enduringly prominent in both academic research and practice in the field of IS. We explain the continuing popularity of the instrument by the salience of the item set for predicting overall customer satisfaction, suggesting that the preoccupation with the dimensions has been a distraction. The implicit mutual exclusivity of the items suggests a more appropriate conceptualization of IS-ServQual as a formative index. This conceptualization resolves the paradox in IS-ServQual research: how an instrument with such well-known and well-documented weaknesses continues to be so influential and widely used by academics and practitioners. A formative conceptualization acknowledges and addresses the criticisms of IS-ServQual, while simultaneously explaining its enduring salience by focusing on the items rather than the “dimensions.” By employing an opportunistic sample and adopting the most recent IS-ServQual instrument published in a leading IS journal (virtually any valid IS-ServQual sample in combination with a previously tested instrument variant would suffice for study purposes), we demonstrate that when re-specified as both a first-order and a second-order formative, IS-ServQual has good model quality metrics and high predictive power on customer satisfaction. We conclude that this formative specification has higher practical use and is more defensible theoretically.
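As a simplified analogue of the formative logic described above (the paper itself uses a PLS-based formative specification, not ordinary least squares), the sketch below regresses simulated overall-satisfaction scores directly on individual item responses rather than on dimension scores, and reports the predictive R².

```python
# Simplified analogue of the formative logic: satisfaction is regressed on the
# individual items, not dimension averages. Data are simulated placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_resp, n_items = 200, 10
items = rng.normal(size=(n_resp, n_items))                 # simulated item responses
true_weights = rng.uniform(0.1, 0.6, size=n_items)         # unknown "true" item weights
satisfaction = items @ true_weights + rng.normal(scale=0.5, size=n_resp)

X = np.column_stack([np.ones(n_resp), items])              # add intercept
coef, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)    # OLS item weights

pred = X @ coef
r2 = 1 - np.sum((satisfaction - pred) ** 2) / np.sum((satisfaction - satisfaction.mean()) ** 2)
print(f"R^2 of item-level (formative-style) model: {r2:.2f}")
```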
Abstract:
In 2013, social networking was the second most popular online activity after internet banking for Australians (ABS, 2014). The popularity and apparent ubiquity of social media is one of the most obvious and compelling arguments for integrating such technologies into higher education. Already, social media impacts a wide range of activities, from marketing and communication to teaching and learning in higher education (Hrastinski & Dennen, 2012). Social media presents many exciting possibilities and opportunities for higher education. This session will focus on one staff-focussed and one student-focussed social media innovation currently underway at QUT. First, it will focus on the actions of QUT’s social media working group. The working group’s aim is to ensure an overarching social media policy for the university is developed and implemented that supports staff in the use of social media across a range of activities. Second, it will discuss the eResponsible and eProfessional Online resources for students project. The focus of this project is to develop a suite of online resources targeted at the development of social media skills for undergraduate students at QUT. These initiatives are complementary, and both aim to minimise risk while maximising opportunities for the university.
Abstract:
Hamstring strain injuries are the predominant injury in many sports, imposing a significant financial and performance burden on athletes and clubs. Therefore the ability to identify and intervene with individuals who are considered at high risk of injury is important. One measure which has grown in popularity as an outcome variable in hamstring intervention/prevention studies and rehabilitation is the angle of peak knee flexor torque. This current opinion article will firstly introduce the measure and the processes behind it. Secondly, it will summarise how the angle of peak knee flexor torque has been suggested as a measure of hamstring strain injury risk. Finally, various limitations will be presented, with an outline of how they may influence the measure. These include the lack of muscle specificity, the common concentric contraction mode of assessment, the reliability of the measure, various neural contributions (such as rate of force development and neuromuscular inhibition), and the lack of prospective data showing any predictive value in the measure.
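A minimal sketch of how the angle of peak knee flexor torque might be extracted from an isokinetic torque–angle trace is shown below; the synthetic data and the quadratic-fit smoothing are assumptions for illustration, not the processing pipeline used in the studies the article discusses.

```python
# Find the knee joint angle at which flexor torque peaks.
# The data and quadratic-fit smoothing are illustrative assumptions.
import numpy as np

angle_deg = np.arange(10, 95, 5)                                     # knee flexion angle
torque_nm = -0.02 * (angle_deg - 30) ** 2 + 110                      # synthetic torque curve
torque_nm += np.random.default_rng(1).normal(0, 2, angle_deg.size)   # measurement noise

coeffs = np.polyfit(angle_deg, torque_nm, 2)                         # quadratic fit smooths the noise
fine_angle = np.linspace(angle_deg.min(), angle_deg.max(), 500)
fitted = np.polyval(coeffs, fine_angle)

angle_of_peak_torque = fine_angle[np.argmax(fitted)]
print(f"angle of peak knee flexor torque: {angle_of_peak_torque:.1f} deg")
```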
Abstract:
Introduction and Aims Wastewater analysis provides a non-intrusive way of measuring drug use within a population. We used this approach to determine daily use of conventional illicit drugs [cannabis, cocaine, methamphetamine and 3,4-methylenedioxymethamphetamine (MDMA)] and emerging illicit psychostimulants (benzylpiperazine, mephedrone and methylone) in two consecutive years (2010 and 2011) at an annual music festival. Design and Methods Daily composite wastewater samples, representative of the festival, were collected from the on-site wastewater treatment plant and analysed for drug metabolites. Data over the 2 years were compared using the Wilcoxon matched-pair test. Data from the 2010 festival were compared with data collected at the same time from a nearby urban community using equivalent methods. Results Conventional illicit drugs were detected in all samples, whereas emerging illicit psychostimulants were found only on specific days. The estimated per capita consumption of MDMA, cocaine and cannabis was similar between the two festival years. Statistically significant (P < 0.05; Z = −2.0–2.2) decreases were observed in the use of methamphetamine and one emerging illicit psychostimulant (benzylpiperazine). Only consumption of MDMA was elevated at the festival compared with the nearby urban community. Discussion and Conclusions Rates of substance use at this festival remained relatively consistent over the two monitoring years. Compared with the urban community, drug use among festival goers was elevated only for MDMA, confirming its popularity in music settings. Our study demonstrated that wastewater analysis can objectively capture changes in substance use at a music setting without raising major ethical issues. It would potentially allow effective assessment of drug prevention strategies in such settings in the future.
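The year-on-year comparison uses a Wilcoxon matched-pair test; the snippet below applies that test to invented daily per-capita drug loads for matched festival days (placeholder figures only, not data from the study).

```python
# Wilcoxon matched-pair test on hypothetical daily per-capita drug loads
# (mg/day/1000 people) for the same festival days in 2010 and 2011.
from scipy.stats import wilcoxon

loads_2010 = [12.1, 14.3, 9.8, 11.5, 13.0]   # placeholder values only
loads_2011 = [10.4, 12.9, 8.1, 10.2, 11.7]

stat, p_value = wilcoxon(loads_2010, loads_2011)
print(f"Wilcoxon statistic = {stat}, p = {p_value:.3f}")
```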
Abstract:
Concentrations of several pesticides were monitored in a paddy block and in the Kose river, which drains a paddy catchment in Fukuoka prefecture, Japan. Detailed water management in the block was also monitored to evaluate its effect on pesticide contamination. The concentrations of applied pesticides in both the block irrigation channel and the drainage canal increased to tens of μg/L shortly after their application. The increase in pesticide concentrations was well correlated with the opening of irrigation and drainage gates in the pesticide-treated paddy plots only 1–3 days after pesticide application. High concentrations of other pesticides, mainly herbicides, were also observed in the inflow irrigation and drainage waters, confirming the popularity of early irrigation and drainage after pesticide application in the area. The requirement to hold water after pesticide application (as a best management practice) issued by the authority was thus not properly followed. At the larger scale of the paddy catchment, pesticide concentrations also increased significantly, to several μg/L, in the water of the Kose river shortly after the start of the pesticide application period in both downstream and mid–upstream areas, confirming the effect of current water management on water quality. Greater extension and enforcement of water management practices are needed in order to control pesticide pollution from rice cultivation in Japan.
Abstract:
"This chapter reviews the capacity of the discipline field to account for the velocity and quality of digitally-driven transformations, while making a case for a "middle range" approach that steers between unbridled optimism ("all-change") and determined scepticism ("Continuity") about the potential of such change. The chapter focuses on online screen distribution as a case study, considering the evidence for, and significance of, change in industry structure and the main payers, how content is produced and by whom, the nature of content, and the degree to which online screen distribution has reached thresholds of mainstream popularity."