984 results for Concept Map


Relevance:

20.00%

Publisher:

Abstract:

This paper reports on the views of Singaporean teachers regarding a mandated curriculum innovation aimed at changing the nature of games pedagogy within the physical education curriculum framework in Singapore. Since its first appearance over 20 years ago, Teaching Games for Understanding (TGfU) has, as an approach to games pedagogy, gathered support around the world. Through a process of evolution, TGfU now has many guises, one of the latest being the Games Concept Approach (GCA), the name given to this pedagogical approach in Singapore. As part of a major national curricular reform project, the GCA was identified as the preferred method of games teaching and was consequently mandated as required professional practice within physical education teaching. To prepare teachers for the implementation phase, a training program was developed by the National Institute of Education in conjunction with the Ministry of Education and well-known experts in the field from the United States. For this part of the study, 22 teachers from across Singapore were interviewed. The data were used to create three fictional narratives, a process described by Sparkes (2002a) and used more recently by Ryan (2005) in the field of literacy. The stories were framed using Foucault’s (1980/1977) notion of governmentality and Bernstein’s (1996) notion of regulative discourse. The narratives reveal tales of confusion and frustration, but also of hope and enthusiasm.

Relevance:

20.00%

Publisher:

Abstract:

How do celebrities like Gordon Ramsay appeal to consumers? This article examines one explanation in the context of celebrity chefs. We examine how a consumer's self-concept clarity (SCC) interacts with their perception of the meaning that a celebrity endorser possesses. An experiment comparing fictional ads endorsed by different celebrity chefs yields the surprising result that consumers with a clear sense of who they are (high-SCC consumers) are more influenced by an ad featuring a celebrity high in meaning (Ramsay), whereas low-SCC consumers are slightly more influenced by a celebrity with lower levels of celebrity meaning.

Relevance:

20.00%

Publisher:

Abstract:

The concept of media influence has a long history in media and communication studies, and has also had significant influence on public policy. This article revisits questions of media influence through three short case studies. First, it critically analyses the strongly partisan position of News Corporation’s newspapers against the Labor government during the 2013 Australian Federal election to consider whether the potential for media influence equated to the effective use of media power. Second, it discusses the assumption in broadcasting legislation, in both the United Kingdom and Australia, that terrestrial broadcasting should be subject to more content regulation than subscription services, and notes the new challenges arising from digital television and over-the-top video streaming services. Finally, it discusses the rise of multi-platform global content aggregators such as Google, Apple, Microsoft and others, and how their rise necessitates changes in ways of thinking about concentration of media ownership, and regulations that may ensue from it.

Relevance:

20.00%

Publisher:

Abstract:

This thesis critically explored the concept of collaboration through an analysis of the experiences of midwives, child health nurses and women in the process of transition from hospital to community care, and of related policy documents. The research concluded that the concept serves an important social function in obscuring the complexity of social relations in healthcare. Rather than adopt an unquestioning attitude to what is represented as collaboration, this thesis argues for a more critical examination of what is occurring, what is potentially hidden and how specific interests are served through its use.

Relevance:

20.00%

Publisher:

Abstract:

Natural disasters cause widespread disruption, costing the Australian economy $6.3 billion per year, and those costs are projected to rise incrementally to $23 billion by 2050. As natural disasters become more frequent and their consequences more severe, Australian communities need the ability to prepare and plan for them, absorb and recover from them, and adapt more successfully to their effects. Enhancing Australian resilience will allow us to better anticipate disasters and to plan to reduce losses, rather than just waiting for the next king hit and paying for it afterwards. Given the scale of devastation, governments have been quick to pick up the pieces when major natural disasters hit. But this approach (‘The government will give you taxpayers’ money regardless of what you did to help yourself, and we’ll help you rebuild in the same risky area.’) has created a culture of dependence, which is unsustainable and costly. In 2008, ASPI published Taking a punch: building a more resilient Australia. That report emphasised the importance of strong leadership and coordination in disaster resilience policymaking, as well as the value of volunteers and of family and individual preparation, in managing the effects of major disasters. This report offers a roadmap for enhancing Australia’s disaster resilience, building on the 2011 National Strategy for Disaster Resilience. It includes a snapshot of relevant issues and current resilience efforts in Australia, outlining key challenges and opportunities, and sets out 11 recommendations to help guide Australia towards increased national resilience, from individuals and local communities through to state and federal agencies.

Relevance:

20.00%

Publisher:

Abstract:

Map-matching algorithms that utilise road segment connectivity along with other data (i.e. position, speed and heading) are normally suited to high-frequency (1 Hz or higher) GPS positioning data. When such algorithms are applied to low-frequency data (such as data from a fleet of private cars, buses or light-duty vehicles, or from smartphones), their performance falls to around 70% in terms of correct link identification, especially in urban and suburban road networks. This level of performance may be insufficient for some real-time Intelligent Transport System (ITS) applications and services, such as estimating link travel time and speed from low-frequency GPS data. This paper therefore develops a new weight-based shortest path and vehicle trajectory aided map-matching (stMM) algorithm that enhances the map-matching of low-frequency positioning data on a road map. The well-known A* search algorithm is employed to derive the shortest path between two points while taking into account both link connectivity and turn restrictions at junctions. In the developed stMM algorithm, two additional weights related to the shortest path and the vehicle trajectory are considered: one compares the distance along the shortest path with the distance along the vehicle trajectory, while the other is associated with the heading difference of the vehicle trajectory. The developed stMM algorithm is tested using a series of real-world datasets of varying frequencies (i.e. 1 s, 5 s, 30 s and 60 s sampling intervals). A high-accuracy integrated navigation system (a high-grade inertial navigation system and a carrier-phase GPS receiver) is used to measure the accuracy of the developed algorithm. The results suggest that the algorithm identifies 98.9% of links correctly for 30 s GPS data. Omitting the information from the shortest path and vehicle trajectory reduces the accuracy to about 73% in terms of correct link identification. The algorithm can process on average 50 positioning fixes per second, making it suitable for real-time ITS applications and services.
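
To make the role of the two stMM-style weights concrete, the sketch below scores a candidate link using (a) agreement between the A*-derived shortest-path length and the distance travelled along the GPS trajectory, and (b) the heading difference between the vehicle trajectory and the candidate link. This is a minimal illustration assuming a networkx road graph with a 'length' edge attribute; the function names and exact scoring are simplifications for illustration, not the paper's implementation.

```python
# Illustrative sketch of the two additional stMM weights, assuming a
# networkx directed road graph with a "length" edge attribute.
import math
import networkx as nx

def heading_weight(vehicle_heading_deg: float, link_bearing_deg: float) -> float:
    """Score a candidate link by how closely its bearing matches the
    vehicle trajectory heading (1.0 = parallel, 0.0 = perpendicular or worse)."""
    diff = abs(vehicle_heading_deg - link_bearing_deg) % 360.0
    diff = min(diff, 360.0 - diff)  # fold into [0, 180]
    return max(0.0, math.cos(math.radians(diff)))

def shortest_path_weight(graph: nx.DiGraph, prev_node, cand_node,
                         trajectory_dist_m: float) -> float:
    """Compare the A* shortest-path length between two successive matched
    nodes with the distance travelled along the GPS trajectory; similar
    lengths (ratio near 1) support the candidate link."""
    try:
        sp_dist = nx.astar_path_length(graph, prev_node, cand_node,
                                       weight="length")
    except nx.NetworkXNoPath:
        return 0.0
    longer = max(sp_dist, trajectory_dist_m, 1e-6)
    return 1.0 - abs(sp_dist - trajectory_dist_m) / longer
```

In a full map-matcher these two scores would be combined with the conventional position, speed and heading weights before the most likely link is selected for each fix.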

Relevance:

20.00%

Publisher:

Abstract:

This special issue of Cultural Science Journal is devoted to the report of a groundbreaking experiment in re-coordinating global markets for specialist scholarly books and enabling the knowledge commons: the Knowledge Unlatched proof-of-concept pilot. The pilot took place between January 2012 and September 2014. It involved libraries, publishers, authors, readers and research funders in the process of developing and testing a global library consortium model for supporting Open Access books. The experiment established that authors, librarians, publishers and research funding agencies can work together in powerful new ways to enable open access; that doing so is cost effective; and that a global library consortium model has the potential dramatically to widen access to the knowledge and ideas contained in book-length scholarly works.

Relevance:

20.00%

Publisher:

Abstract:

As a precursor to the 2014 G20 Leaders’ Summit held in Brisbane, Australia, the Queensland Government sponsored a program of G20 Cultural Celebrations designed to showcase the Summit’s host city. The cultural program’s signature event was the Colour Me Brisbane festival, a two-week citywide festival of interactive light and projection installations that was originally slated to run from 24 October to 9 November but, due to popular demand, was extended to conclude with the G20 Summit itself on 16 November. The festival comprised a series of projection displays that promoted visions of the city’s past, present, and future at landmark sites and iconic buildings throughout the city’s central business district, and thus transformed key buildings into forms of media architecture. In some instances the media architecture installations were interactive, allowing the public to control aspects of the projections through a computer interface situated in front of the building; the majority of the installations, however, were not interactive in this sense. The festival was supported by a website that included information regarding the different visual and interactive displays, and links to social media to support public discussion of the festival (Queensland Government 2014). Festival-goers were also encouraged to follow a walking-tour map of the projection sites that would take them on a 2.5 kilometre walk from Brisbane’s cultural precinct, through the city centre, concluding at Parliament House. In this paper, we investigate the Colour Me Brisbane festival and the broader G20 Cultural Celebrations as a form of strategic placemaking—designed, on the one hand, to promote Brisbane as a safe, open, and accessible city in line with the City Council’s plan to position Brisbane as a ‘New World City’ (Brisbane City Council 2014), and deployed, on the other, to counteract growing local concerns and tensions over the disruptive and politicised nature of the G20 Summit by engaging the public with the city prior to the heightened security and mobility restrictions of the Summit weekend. Harnessing perspectives from media architecture (Brynskov et al. 2013), urban imaginaries (Cinar & Bender 2007), and social media analysis, we take a critical approach to analysing the government-sponsored projections, which literally projected the city onto itself, and public responses to them via the official, and heavily promoted, social media hashtags (#colourmebrisbane and #g20cultural). Our critical framework extends the concepts of urban phantasmagoria and urban imaginaries into the emerging field of media architecture to scrutinise its potential for increased political and civic engagement. Walter Benjamin’s concept of phantasmagoria (Cohen 1989; Duarte, Firmino, & Crestani 2014) provides an understanding of urban space as spectacular projection, implicated in commodity and techno-culture. The concept of urban imaginaries (Cinar & Bender 2007; Kelley 2013)—that is, the ways in which citizens’ experiences of urban environments are transformed into symbolic representations through the use of imagination—similarly provides a useful framing device for thinking about the Colour Me Brisbane projections and their relation to the construction of place.
Employing these critical frames enables us to examine the ways in which the installations open up the potential for multiple urban imaginaries—in the sense that they encourage civic engagement via a tangible and imaginative experience of urban space—while, at the same time, supporting a particular vision and way of experiencing the city, promoting a commodified, sanctioned form of urban imaginary. This paper aims to dissect the urban imaginaries intrinsic to the Colour Me Brisbane projections and to examine how those imaginaries were strategically deployed as place-making schemes that choreograph reflections about and engagement with the city.

Relevance:

20.00%

Publisher:

Abstract:

This cross-disciplinary study was conducted as two research and development projects. The outcome is a multimodal and dynamic chronicle, which incorporates the tracking of spatial, temporal and visual elements of performative practice-led and design-led research journeys. The distilled model provides a strong new approach to demonstrating rigour in non-traditional research outputs, including provenance and an 'augmented web of facticity'.

Relevance:

20.00%

Publisher:

Abstract:

Creative digital media are increasingly utilized by companies in all industries. Here, case studies of creative media innovations in manufacturing, mining and education were facilitated and evaluated. The cases dealt, respectively, with designs in manufacturing, visualizing mining data, and developing tools for adult literacy. The difficulties of merging creative media teams into these different contexts were noted, and from them the idea of creative interoperability was developed. Creative interoperability explains how creative teams can connect with other disciplines to bring about innovations.

Relevance:

20.00%

Publisher:

Abstract:

Background: There has been growing interest in mixed-species plantation systems because of their potential to provide a range of socio-economic and bio-physical benefits that can be matched to the diverse needs of smallholders and communities. Potential benefits include the production of a range of forest products for home and commercial use; improved soil fertility, especially when nitrogen-fixing species are included; improved survival rates and greater productivity of species; a reduction in the amount of damage from pests or disease; and improved biodiversity and wildlife habitats. Despite these documented services and growing interest in mixed-species plantation systems, the actual planting areas in the tropics are small, and monocultures are still preferred for industrial plantings and many reforestation programs because of perceived higher economic returns and readily available information about the species and their silviculture. In contrast, there are few guidelines for the design and management of mixed-species systems, including the social and ecological factors behind successful mixed-species plantings.
Methods: This protocol explains the methodology used to investigate the following question: What is the available evidence for the relative performance of different designs of mixed-species plantings for smallholder and community forestry in the tropics? This study will systematically search, identify and describe studies related to mixed-species plantings across tropical and temperate zones to identify the social and ecological factors that affect polyculture systems. The objectives of this study are, first, to identify the evidence on biophysical or socio-economic factors that have been considered when designing mixed-species systems for community and smallholder forestry in the tropics; and second, to identify gaps in research on mixed-species plantations. Results of the study will help create guidelines that can assist practitioners, scientists and farmers to better design mixed-species plantation systems for smallholders in the tropics.

Relevance:

20.00%

Publisher:

Abstract:

In the development of technological systems, the focus of system analysis is often on the sub-system that delivers insufficient performance – the reverse salient – and thereby limits the performance of the system in its entirety. The reverse salient is therefore a useful concept in the study of technological systems, and while the literature holds numerous accounts of its use, it is not known how often, in which streams of literature, and in what types of application the concept has been utilized by scholars since its introduction by Thomas Hughes in 1983. In this paper we employ bibliometric citation analysis over the period 1983 to 2008, inclusive, to study the impact of the reverse salient concept in the literature at large, as well as the dissemination of the concept into different fields of research. The results show a continuously growing number of citations of the concept over time, as well as its increasing diffusion into different research areas. The analysis of article contents additionally suggests an opportunity for scholars to engage in deeper conceptual application. Finally, the continuing increase in the number of citations highlights the importance of the reverse salient concept to scholars and practitioners.
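
At its core, this kind of bibliometric citation analysis is a tally of citing records by publication year and by research area. The sketch below illustrates the idea on a purely hypothetical record structure; it is not the authors' dataset or code.

```python
# Hypothetical sketch of a bibliometric tally: citations per year and
# diffusion across research areas. The records are placeholders.
from collections import Counter

citing_records = [
    # (publication year, research area) of each article citing the concept
    (1984, "history of technology"),
    (1997, "innovation studies"),
    (2005, "management"),
    (2005, "innovation studies"),
]

citations_per_year = Counter(year for year, _ in citing_records)
areas = Counter(area for _, area in citing_records)

for year in sorted(citations_per_year):
    print(year, citations_per_year[year])   # growth of citations over time
print("distinct research areas:", len(areas))  # diffusion across fields
```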

Relevance:

20.00%

Publisher:

Abstract:

Objective: This paper presents an automatic active learning-based system for the extraction of medical concepts from clinical free-text reports. Specifically, we determine (1) the contribution of active learning in reducing the annotation effort, and (2) the robustness of an incremental active learning framework across different selection criteria and datasets.
Materials and methods: The comparative performance of an active learning framework and a fully supervised approach was investigated to study how active learning reduces the annotation effort while achieving the same effectiveness as a supervised approach. Conditional Random Fields were used as the supervised method, with least confidence and information density as the two selection criteria for the active learning framework. The effect of incremental learning versus standard learning on the robustness of the models within the active learning framework, under the different selection criteria, was also investigated. Two clinical datasets were used for evaluation: the i2b2/VA 2010 NLP challenge and the ShARe/CLEF 2013 eHealth Evaluation Lab.
Results: The annotation effort saved by active learning to achieve the same effectiveness as supervised learning is up to 77%, 57%, and 46% of the total number of sequences, tokens, and concepts, respectively. Compared to the random sampling baseline, the saving is at least doubled.
Discussion: Incremental active learning guarantees robustness across all selection criteria and datasets. The reduction of annotation effort is always above the random sampling and longest sequence baselines.
Conclusion: Incremental active learning is a promising approach for building effective and robust medical concept extraction models while significantly reducing the burden of manual annotation.
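
As an illustration of the least-confidence criterion described above, the sketch below runs a generic pool-based active learning loop. It uses a scikit-learn classifier as a stand-in for the paper's CRF sequence labeller, so the model, batch size and stopping rule are illustrative assumptions rather than the paper's setup.

```python
# Minimal pool-based active learning loop with least-confidence sampling.
import numpy as np
from sklearn.linear_model import LogisticRegression

def least_confidence_loop(X_pool, y_pool, X_seed, y_seed,
                          batch_size=10, rounds=5):
    """Iteratively query the pool items the model is least confident about."""
    X_train, y_train = list(X_seed), list(y_seed)
    pool_idx = list(range(len(X_pool)))
    model = LogisticRegression(max_iter=1000)
    for _ in range(rounds):
        model.fit(np.asarray(X_train), np.asarray(y_train))
        proba = model.predict_proba(np.asarray([X_pool[i] for i in pool_idx]))
        confidence = proba.max(axis=1)          # top-class probability per item
        query = np.argsort(confidence)[:batch_size]  # least confident first
        for pos in sorted(query, reverse=True):      # pop from the end first
            i = pool_idx.pop(int(pos))
            X_train.append(X_pool[i])   # in practice an annotator labels this
            y_train.append(y_pool[i])
    return model
```

Information density, the other selection criterion, would additionally weight each candidate by its average similarity to the rest of the pool, favouring uncertain items that are also representative.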

Relevance:

20.00%

Publisher:

Abstract:

We extended genetic linkage analysis – an analysis widely used in quantitative genetics – to 3D images to analyze single-gene effects on brain fiber architecture. We collected 4 Tesla diffusion tensor images (DTI) and genotype data from 258 healthy adult twins and their non-twin siblings. After high-dimensional fluid registration, at each voxel we estimated the genetic linkage between the single nucleotide polymorphism (SNP) Val66Met (dbSNP number rs6265) of the BDNF gene (brain-derived neurotrophic factor) and fractional anisotropy (FA) derived from each subject's DTI scan, by fitting structural equation models (SEM) from quantitative genetics. We also examined how image filtering affects the effect sizes for genetic linkage, by examining how the overall significance of voxelwise effects varied with the full width at half maximum (FWHM) of the Gaussian smoothing applied to the FA images. Raw FA maps with no smoothing yielded the greatest sensitivity to detect gene effects when corrected for multiple comparisons using the false discovery rate (FDR) procedure. The BDNF polymorphism significantly contributed to the variation in FA in the posterior cingulate gyrus, where it accounted for around 90-95% of the total variance in FA. Our study generated the first maps to visualize the effect of the BDNF gene on brain fiber integrity, suggesting that common genetic variants may strongly determine white matter integrity.
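
For readers unfamiliar with the FDR step, the following is a small sketch of the Benjamini-Hochberg procedure applied to a vector of voxelwise p-values; the uniform random p-values are placeholders, not study data.

```python
# Benjamini-Hochberg FDR correction for voxelwise p-values (sketch).
import numpy as np

def fdr_threshold(p_values: np.ndarray, q: float = 0.05) -> float:
    """Return the largest p-value threshold satisfying the
    Benjamini-Hochberg criterion p_(k) <= (k/m) * q, or 0.0 if none."""
    p_sorted = np.sort(p_values)
    m = len(p_sorted)
    below = p_sorted <= (np.arange(1, m + 1) / m) * q
    return float(p_sorted[below].max()) if below.any() else 0.0

rng = np.random.default_rng(0)
p_vox = rng.uniform(size=10_000)   # placeholder voxelwise p-values
thr = fdr_threshold(p_vox, q=0.05)
print(f"voxels declared significant: {(p_vox <= thr).sum()}")
```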

Relevance:

20.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to explore the concept of service quality in settings where several customers are involved in the joint creation and consumption of a service, providing first insights into the implications of simultaneous multi-customer integration for service quality.
Design/methodology/approach: This conceptual paper undertakes a thorough review of the relevant literature before developing a conceptual model of service co-creation and service quality in customer groups.
Findings: Group service encounters must be set up carefully to account for the dynamics (social activity) within a customer group and for the skill set and capabilities (task activity) of each of the individual participants involved in a group service experience.
Research limitations/implications: Future research should undertake empirical studies to validate and/or modify the model suggested in this contribution.
Practical implications: Managers of service firms should be made aware of the implications and underlying factors of group services in order to create and manage a group experience successfully. Particular attention should be given to those factors that service providers can influence in managing encounters with multiple customers.
Originality/value: This article introduces a new conceptual approach to service encounters with groups of customers in a proposed service quality model. In particular, the paper focuses on integrating the impact of customers' co-creation activities on service quality into a multiple-actor model.