388 results for automatic content extraction
Abstract:
In pavement design, the resilient modulus of a pavement material is one of the key design parameters. The resilient modulus of a granular pavement material can be measured using the repeated load triaxial (RLT) test or estimated using empirical models. For conventional granular pavement materials, a significant amount of resilient modulus data and empirical models to estimate this key design parameter are available. However, recycled concrete aggregate (RCA) is a relatively new granular pavement material, and therefore no such data or empirical models are available. In this study, a number of RLT tests were conducted on RCA samples to investigate the effects of moisture content on its resilient modulus (Mr). It was observed that the resilient modulus of RCA increased with the number of loading cycles but decreased as the moisture content was increased. Further, using the RLT test results, empirical models to estimate the resilient modulus of RCA were enhanced and validated.
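Empirical models of this kind typically relate resilient modulus to the applied stress state and are fitted to RLT data by least squares. As a purely illustrative sketch (the k-theta model form, units, and data values below are assumptions, not the models enhanced in this study), such a fit can be set up as follows:

```python
# Minimal sketch of fitting a k-theta type empirical resilient modulus model to
# RLT data by least squares. The model form, units, and data values are
# illustrative assumptions, not the models enhanced in this study.
import numpy as np
from scipy.optimize import curve_fit

def k_theta_model(theta, k1, k2):
    """Classic k-theta relationship: Mr = k1 * theta**k2."""
    return k1 * theta ** k2

# Hypothetical RLT results: bulk stress theta (kPa) vs measured resilient modulus (MPa)
theta = np.array([100.0, 200.0, 300.0, 450.0, 600.0])
mr_measured = np.array([150.0, 205.0, 250.0, 305.0, 350.0])

(k1, k2), _ = curve_fit(k_theta_model, theta, mr_measured, p0=(10.0, 0.5))
print(f"fitted k1 = {k1:.2f}, k2 = {k2:.3f}")
print(f"predicted Mr at theta = 350 kPa: {k_theta_model(350.0, k1, k2):.1f} MPa")
```

A moisture-dependence term could be added to the model function in the same way, with moisture content passed as a second independent variable to the fit.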
Abstract:
This paper discusses the following key messages. Taxonomy is (and taxonomists are) more important than ever in times of global change. Taxonomic endeavour is not occurring fast enough: in the 250 years since the creation of the Linnean Systema Naturae, only about 20% of Earth's species have been named. We need fundamental changes to the taxonomic process and paradigm to increase taxonomic productivity by orders of magnitude. Currently, taxonomic productivity is limited principally by the rate at which we capture and manage morphological information to enable species discovery. Many recent (and welcome) initiatives in managing and delivering biodiversity information and accelerating the taxonomic process do not address this bottleneck. Development of computational image analysis and feature extraction methods is a crucial missing capacity needed to enable taxonomists to overcome the taxonomic impediment in a meaningful time frame. Copyright © 2009 Magnolia Press.
Abstract:
Social media enable advertising agencies to engage directly with the public by participating in, and observing, real conversations. The current study recruited a Delphi panel to explore how some of the world's leading advertising professionals view the use of social media to test, track, and evaluate advertising campaigns, and how they identify related risks and ethical considerations. The findings suggest that agencies primarily use social media as a tool for understanding consumers and igniting insight, not as a means of testing creative ideas. The authors believe this research provides an important benchmark of agency best practice in social-media research and outlines ethical implications.
Abstract:
In Australia, international tourists/visitors are one of the highest-risk groups for drowning at beaches. Swimming in patrolled areas, between the flags, reduces the risk of drowning, with most drownings occurring outside these areas. There is a need to understand the beliefs which influence the extent to which international tourists/visitors intend to swim between the flags. The theory of planned behaviour (TPB) and, in particular, the indirect beliefs which underpin constructs in the model, represent a means of determining what factors influence this intention. The current study compared international tourists/visitors with either low or high intentions to swim between the flags on a range of behavioural, normative, and control beliefs. A series of MANOVAs revealed significant differences between the groups on all three types of beliefs. The findings provide insight into potential foci for message content for use in educational campaigns aimed at keeping international visitors safe on Australian beaches.
Abstract:
Diagnosis of articular cartilage pathology in the early disease stages using current clinical diagnostic imaging modalities is challenging, particularly because there is often no visible change in the tissue surface and matrix content, such as proteoglycans (PG). In this study, we propose the use of near infrared (NIR) spectroscopy to spatially map PG content in articular cartilage. The relationship between NIR spectra and reference data (PG content) obtained from histology of normal and artificially induced PG-depleted cartilage samples was investigated using principal component (PC) and partial least squares (PLS) regression analyses. A significant correlation was obtained between the two datasets (R² = 91.40%, p < 0.0001). The resulting correlation was used to predict PG content from spectra acquired from a whole joint sample; this was then employed to spatially map this component of cartilage across the intact sample. We conclude that NIR spectroscopy is a feasible tool for evaluating cartilage constituents and mapping their distribution across mammalian joints.
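As an illustrative sketch of the PLS step described above (the spectra, wavelength grid, and reference PG values below are synthetic assumptions, and scikit-learn's PLSRegression stands in for whatever software the authors used), a regression relating NIR spectra to a reference property can be set up as follows:

```python
# Minimal sketch: PLS regression relating NIR spectra to a reference property
# (a stand-in for proteoglycan content). All data are synthetic; this is not
# the authors' pipeline, only an illustration of the PLS step they describe.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 40, 200

pg_content = rng.uniform(10, 60, n_samples)            # hypothetical reference values
basis = rng.normal(size=n_wavelengths)                  # one spectral "feature" direction
spectra = np.outer(pg_content, basis) + rng.normal(scale=5.0, size=(n_samples, n_wavelengths))

pls = PLSRegression(n_components=5)
predicted = cross_val_predict(pls, spectra, pg_content, cv=5).ravel()
print("cross-validated R^2:", round(r2_score(pg_content, predicted), 3))

# Fit on all samples, then predict PG content for new spectra
# (e.g. spectra acquired at points across an intact joint surface)
pls.fit(spectra, pg_content)
new_spectra = np.outer(rng.uniform(10, 60, 5), basis)
print("predicted PG content:", pls.predict(new_spectra).ravel())
```

Mapping would then amount to evaluating the fitted model at each measurement location and plotting the predictions over the joint surface.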
Abstract:
This thesis examines the confluence of digital technology, evolving classroom pedagogy and young people's screen use, demonstrating how screen content can be deployed, curated, and developed for effective use in contemporary classrooms. Based on four detailed case studies drawn from the candidate's professional creative practice, the research presents a set of design considerations for educational media that distill the relevance of the research for screen producers seeking to develop a more productive understanding of and engagement with the school education sector.
Abstract:
We present an overview of the QUT plant classification system submitted to LifeCLEF 2014. This system uses generic features extracted from a convolutional neural network previously used to perform general object classification. We examine the effectiveness of these features to perform plant classification when used in combination with an extremely randomised forest. Using this system, with minimal tuning, we obtained relatively good results with a score of 0.249 on the test set of LifeCLEF 2014.
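As a minimal sketch of this kind of pipeline (the network, the layer used as the feature extractor, and the toy data below are assumptions, not the QUT LifeCLEF setup), generic CNN descriptors can be fed to an extremely randomised forest as follows:

```python
# Minimal sketch: generic features from a pretrained CNN fed to an extremely
# randomised forest. ResNet-18 and the random "images" are placeholders, not
# the actual network or data used in the LifeCLEF 2014 submission.
import numpy as np
import torch
import torchvision.models as models
from sklearn.ensemble import ExtraTreesClassifier

# Pretrained CNN with its classification head removed -> generic image descriptors
cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
cnn.fc = torch.nn.Identity()
cnn.eval()

# Stand-in for real photos: a random batch of 4 RGB images at 224x224
# (real images would be resized and normalised before being passed in)
dummy_batch = torch.rand(4, 3, 224, 224)
with torch.no_grad():
    features = cnn(dummy_batch).numpy()          # shape (4, 512)

labels = np.array([0, 1, 0, 1])                  # hypothetical species labels
forest = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(features, labels)
print(forest.predict(features))
```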
Abstract:
In this chapter, we explore methods for automatically generating game content—and games themselves—adapted to individual players in order to improve their playing experience or achieve a desired effect. This goes beyond notions of mere replayability and involves modeling player needs to maximize their enjoyment, involvement, and interest in the game being played. We identify three main aspects of this process: generation of new content and rule sets, measurement of this content and the player, and adaptation of the game to change player experience. This process forms a feedback loop of constant refinement, as games are continually improved while being played. Framed within this methodology, we present an overview of our recent and ongoing research in this area. This is illustrated by a number of case studies that demonstrate these ideas in action over a variety of game types, including 3D action games, arcade games, platformers, board games, puzzles, and open-world games. We draw together some of the lessons learned from these projects to comment on the difficulties, the benefits, and the potential for personalized gaming via adaptive game design.
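The generate, measure, adapt cycle described above can be made concrete with a small sketch. The content representation, the player model, and the scoring below are invented placeholders rather than the systems from the chapter's case studies; the point is only the shape of the feedback loop:

```python
# Toy generate -> measure -> adapt loop: a single difficulty knob is nudged
# toward whatever a (stand-in) player model reports as more enjoyable.
def generate_level(difficulty):
    """Produce a candidate level description from a single difficulty knob."""
    return {"enemies": round(difficulty * 10), "platform_gap": round(difficulty * 3.0, 2)}

def measure_enjoyment(level, player_skill):
    """Toy player model: enjoyment is highest when challenge matches skill."""
    challenge = level["enemies"] / 10.0
    return 1.0 - abs(challenge - player_skill)

def adapt(difficulty, player_skill, step=0.1):
    """Move difficulty in whichever direction the player model prefers."""
    up = measure_enjoyment(generate_level(difficulty + step), player_skill)
    down = measure_enjoyment(generate_level(difficulty - step), player_skill)
    return min(1.0, max(0.0, difficulty + step if up >= down else difficulty - step))

player_skill = 0.7          # hypothetical; in practice inferred from observed play
difficulty = 0.2
for session in range(10):   # each iteration = play a level, measure, adapt
    level = generate_level(difficulty)
    enjoyment = measure_enjoyment(level, player_skill)
    difficulty = adapt(difficulty, player_skill)
    print(session, level, round(enjoyment, 2))
```

In a real adaptive game the measurement step would draw on telemetry or explicit feedback from the player, and the generator would cover rule sets and content well beyond a single difficulty parameter.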