186 results for Stamp collecting


Relevance: 10.00%

Abstract:

Researchers have found that transformational leadership is related to positive outcomes in educational institutions. Hence, it is important to explore constructs that may predict leadership style in order to identify potential transformational leaders in assessment and selection procedures. Several studies in non-educational settings have found that emotional intelligence is a useful predictor of transformational leadership, but these studies have generally lacked methodological rigor and contextual relevance. This project, set in Australian educational institutions, employed a more rigorous methodology to answer the question: to what extent is the Mayer and Salovey (1997) model of emotional intelligence a useful predictor of leadership style and perceived leadership outcomes? The project was designed to move research in the field forward by using valid and reliable instruments, controlling for other predictors, obtaining an adequately sized sample of current leaders and collecting multiple ratings of their leadership behaviours. The results of the study (N = 144 leaders and 432 raters) indicated that emotional intelligence was not a useful predictor of leadership style and perceived leadership outcomes. In contrast, several of the other predictors in the study were found to predict leadership style.
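As a rough illustration of what "controlling for other predictors" can look like in practice, the sketch below fits a hypothetical hierarchical regression in which an emotional intelligence score is entered after a set of control predictors, so its incremental contribution to transformational leadership ratings can be inspected. The file name, variable names and predictors are illustrative assumptions, not the project's actual instruments or analysis.

```python
# Hypothetical sketch of "controlling for other predictors": a hierarchical
# OLS regression in which emotional intelligence (EI) is entered after the
# control predictors, so its incremental contribution to leadership ratings
# can be assessed. Variable names and the CSV file are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

ratings = pd.read_csv("leader_ratings.csv")  # hypothetical aggregated rater data

# Step 1: control predictors only (e.g., personality, reasoning ability).
base = smf.ols("transformational ~ extraversion + openness + reasoning", data=ratings).fit()

# Step 2: add the EI score and compare explained variance.
full = smf.ols("transformational ~ extraversion + openness + reasoning + ei_total", data=ratings).fit()

print(f"R-squared without EI: {base.rsquared:.3f}")
print(f"R-squared with EI:    {full.rsquared:.3f}  (increment: {full.rsquared - base.rsquared:.3f})")
```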

Relevance: 10.00%

Abstract:

In this paper, we describe a method to represent and discover adversarial group behavior in a continuous domain. In comparison to other types of behavior, adversarial behavior is heavily structured, as the location of a player (or agent) depends both on their teammates and adversaries, in addition to the tactics or strategies of the team. We present a method which can exploit this relationship through the use of a spatiotemporal basis model. As players constantly change roles during a match, we show that employing a "role-based" representation instead of one based on player "identity" can best exploit the playing structure. As vision-based systems do not currently provide perfect detection/tracking (e.g. missed or false detections), we show that our compact representation can effectively "denoise" erroneous detections as well as enable temporal analysis, which was previously prohibitive due to the dimensionality of the signal. To evaluate our approach, we used a fully instrumented field-hockey pitch with 8 fixed high-definition (HD) cameras and evaluated our approach on approximately 200,000 frames of data from a state-of-the-art real-time player detector, comparing the output to manually labelled data.
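As a hedged illustration of the "role-based" idea, the sketch below assigns each detection in a frame to a role prototype by solving a linear assignment problem (Hungarian algorithm) rather than tracking player identities. The prototype positions and detections are made up, and this is only a toy stand-in for the paper's spatiotemporal basis model.

```python
# Minimal sketch of a per-frame "role" assignment, assuming each role has a
# prototype (mean) position and each frame yields noisy player detections.
# This is an illustration of the role-based idea, not the paper's exact model.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def assign_roles(detections: np.ndarray, role_means: np.ndarray) -> np.ndarray:
    """Return a role index for each detection by minimising total squared distance.

    detections: (N, 2) array of detected player positions on the pitch.
    role_means: (N, 2) array of prototype positions, one per role.
    """
    cost = cdist(detections, role_means, metric="sqeuclidean")  # (N, N) cost matrix
    det_idx, role_idx = linear_sum_assignment(cost)             # Hungarian algorithm
    roles = np.empty(len(detections), dtype=int)
    roles[det_idx] = role_idx
    return roles

# Example: 3 detections matched to 3 roles (left wing, centre, right wing).
role_means = np.array([[10.0, 5.0], [45.0, 25.0], [80.0, 5.0]])
detections = np.array([[44.0, 27.0], [82.0, 6.0], [9.0, 4.0]])
print(assign_roles(detections, role_means))  # e.g. [1 2 0]
```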

Relevance: 10.00%

Abstract:

This paper discusses methodological developments in phenomenography that make it well suited to the study of teaching and learning to use information in educational environments. Phenomenography is typically used to analyze interview data to determine different ways of experiencing a phenomenon. There is an established tradition of phenomenographic research in the study of information literacy (e.g., Bruce, 1997; 2008; Lupton, 2008; Webber, Boon, & Johnston, 2005). Drawing from the large body of evidence compiled over two decades of research, phenomenographers developed variation theory, which explains what a learner can feasibly learn from a classroom lesson based on how the phenomenon being studied is presented (Marton, Runesson, & Tsui, 2004). Variation theory's ability to establish the critical conditions necessary for learning to occur has resulted in the use of phenomenographic methods to study classroom interactions by collecting and analyzing naturalistic data through observation, as well as interviews concerning teachers' intentions and students' different experiences of classroom lessons. Describing the methodological developments of phenomenography in relation to understanding the classroom experience, this paper discusses the potential benefits and challenges of utilizing such methods to research the experiences of teaching and learning to use information in discipline-focused classrooms. The application of phenomenographic methodology for this purpose is exemplified with an ongoing study that explores how students learned to use information in an undergraduate language and gender course (Maybee, Bruce, Lupton, & Rebmann, in press). This paper suggests that by providing a nuanced understanding of what students are intended to learn about using information, and relating that to what transpires in the classroom and how students experience these lessons, phenomenography and variation theory offer a viable framework for further understanding and improving how students are taught, and learn, to use information.

Relevance: 10.00%

Abstract:

This work was composed in relation to the author's research into the popularity of themes of ephemerality and affect in recent global art. This focus correlated with Chicks on Speed's ongoing inquiries into collections and collecting in the artworld, articulated by the group as 'the art dump'. The work was subsequently performed as a contribution to a performance with the international multidisciplinary group Chicks on Speed as part of their residency during MONA FOMA in Tasmania.

Relevance: 10.00%

Abstract:

A food supply that delivers energy-dense products with high levels of salt, saturated fats and trans fats, in large portion sizes, is a major cause of non-communicable diseases (NCDs). The highly processed foods produced by large food corporations are primary drivers of increases in consumption of these adverse nutrients. The objective of this paper is to present an approach to monitoring food composition that can both document the extent of the problem and underpin novel actions to address it. The monitoring approach seeks to systematically collect information on high-level contextual factors influencing food composition and assess the energy density, salt, saturated fat, trans fats and portion sizes of highly processed foods for sale in retail outlets (with a focus on supermarkets and quick-service restaurants). Regular surveys of food composition are proposed across geographies and over time using a pragmatic, standardized methodology. Surveys have already been undertaken in several high- and middle-income countries, and the trends have been valuable in informing policy approaches. The purpose of collecting data is not to exhaustively document the composition of all foods in the food supply in each country, but rather to provide information to support governments, industry and communities to develop and enact strategies to curb food-related NCDs.
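To make the idea of a pragmatic, standardized survey record concrete, here is a minimal sketch of the kind of per-product record and per-category summary such monitoring might produce. The field names, categories and values are hypothetical and not drawn from any actual survey.

```python
# Illustrative sketch of the kind of summary such a survey feeds into:
# per-category median sodium and energy density from standardised product
# records. Field names and example values are hypothetical.
from dataclasses import dataclass
from statistics import median
from collections import defaultdict

@dataclass
class ProductRecord:
    category: str          # e.g. "bread", "breakfast cereal"
    energy_kj_per_100g: float
    sodium_mg_per_100g: float
    portion_size_g: float

records = [
    ProductRecord("bread", 1020, 450, 38),
    ProductRecord("bread", 980, 380, 40),
    ProductRecord("breakfast cereal", 1600, 520, 30),
]

by_category = defaultdict(list)
for r in records:
    by_category[r.category].append(r)

for category, items in by_category.items():
    print(category,
          "median sodium:", median(i.sodium_mg_per_100g for i in items), "mg/100g,",
          "median energy:", median(i.energy_kj_per_100g for i in items), "kJ/100g")
```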

Relevance: 10.00%

Abstract:

Food labelling on food packaging has the potential to have both positive and negative effects on diets. Monitoring different aspects of food labelling would help to identify priority policy options to help people make healthier food choices. A taxonomy of the elements of health-related food labelling is proposed. A systematic review of studies that assessed the nature and extent of health-related food labelling has been conducted to identify approaches to monitoring food labelling. A step-wise approach has been developed for independently assessing the nature and extent of health-related food labelling in different countries and over time. Procedures for sampling the food supply, and collecting and analysing data are proposed, as well as quantifiable measurement indicators and benchmarks for health-related food labelling.
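As one possible shape for a quantifiable indicator, the sketch below computes the proportion of surveyed products carrying a given labelling element and compares it against a benchmark. The element names, products and the 80% benchmark are hypothetical, not part of the proposed taxonomy itself.

```python
# Minimal sketch of one quantifiable indicator: the proportion of surveyed
# products displaying a given health-related labelling element, compared to a
# benchmark. Element names, products and the 80% benchmark are hypothetical.
from typing import Dict, Set

def element_prevalence(products: Dict[str, Set[str]], element: str) -> float:
    """Fraction of products whose recorded label elements include `element`."""
    if not products:
        return 0.0
    return sum(element in elements for elements in products.values()) / len(products)

surveyed = {
    "product_a": {"nutrient_declaration", "health_claim"},
    "product_b": {"nutrient_declaration"},
    "product_c": {"nutrient_declaration", "front_of_pack_label"},
}

prevalence = element_prevalence(surveyed, "nutrient_declaration")
benchmark = 0.80
print(f"Nutrient declaration prevalence: {prevalence:.0%} (benchmark {benchmark:.0%}): "
      f"{'met' if prevalence >= benchmark else 'not met'}")
```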

Relevance: 10.00%

Abstract:

This paper describes a generic, integrated solar-powered remote Unmanned Aerial Vehicle (UAV) and Wireless Sensor Network (WSN) gas sensing system. The system measures CH4 and CO2 concentrations using metal oxide (MOx) and non-dispersive infrared sensors, and combines a new solar cell encapsulation method to power the UAV with a data management platform to store, analyse and share the information with operators and external users. The system was successfully field tested at ground level and at low altitudes, collecting, storing and transmitting data in real time to a central node for analysis and 3D mapping. The system can be used in a wide range of outdoor applications, especially in agriculture, bushfire and mining studies, opening the way to ubiquitous, low-cost environmental monitoring. A video of the bench and flight tests performed can be seen at the following link: https://www.youtube.com/watch?v=Bwas7stYIxQ.
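The abstract does not specify the data transport, but a minimal sketch of the idea of streaming timestamped, geotagged gas readings to a central node might look like the following, assuming JSON over UDP and illustrative field names and addresses.

```python
# Hypothetical sketch of how a gas reading could be timestamped, geotagged and
# streamed to a central node for analysis and 3D mapping. JSON over UDP and the
# field names are illustrative assumptions; the paper's actual platform and
# transport are not specified here.
import json
import socket
import time

CENTRAL_NODE = ("192.168.1.10", 9000)  # hypothetical ground-station address

def send_reading(sock: socket.socket, ch4_ppm: float, co2_ppm: float,
                 lat: float, lon: float, alt_m: float) -> None:
    packet = {
        "t": time.time(),        # UNIX timestamp
        "ch4_ppm": ch4_ppm,      # MOx sensor reading
        "co2_ppm": co2_ppm,      # NDIR sensor reading
        "lat": lat, "lon": lon, "alt_m": alt_m,
    }
    sock.sendto(json.dumps(packet).encode("utf-8"), CENTRAL_NODE)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_reading(sock, ch4_ppm=2.1, co2_ppm=415.0, lat=-27.47, lon=153.03, alt_m=35.0)
```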

Relevance: 10.00%

Abstract:

Purpose – Traumatic events can cause post-traumatic stress disorder due to the severity of the often unexpected events. The purpose of this paper is to reveal how conversations around lived experiences of traumatic events, such as the Christchurch earthquake in February 2011, can work as a strategy for people to come to terms with their experiences collaboratively. By encouraging young children to recall and tell their earthquake stories with their early childhood teachers, they can begin to respond, renew, and recover (Brown, 2012), preventing or minimising the development of further stress. Design/methodology/approach – The study involved collecting data from the participating children, who took turns wearing a wireless microphone while their interactions with each other and with teachers were video recorded over one week in November 2011. A total of eight hours and 21 minutes of footage was collected; four minutes and 19 seconds of that footage are presented and analysed in this paper. The footage was watched repeatedly and transcribed using conversation analysis methods (Sacks, 1995). Findings – Through analysing the detailed turn-taking utterances between teachers and children, the orderliness of the co-production of remembering is revealed, demonstrating that each member orients to being in agreement about what actually happened. These episodes of storytelling between the teachers and children demonstrate how the teachers encourage the children to tell of their experiences by actively engaging in conversations with them about the earthquake. Originality/value – The conversation analysis approach used in this research was found to be useful in investigating aspects of disasters that the participants themselves remember as important and real. This approach offers a unique insight into how the earthquake event was experienced and reflected on by young children and their teachers, and so can inform future policy and provision in post-disaster situations.

Relevance: 10.00%

Abstract:

This article uses the Lavender Library, Archives, and Cultural Exchange of Sacramento, Incorporated, a small queer community archives in Northern California, as a case study for expanding our knowledge of community archives and issues of archival practice. It explores why creating a separate community archives was necessary, the role of community members in founding and maintaining the archives, the development of its collections, and the ongoing challenges community archives face. The article also considers the implications community archives have for professional practice, particularly in the areas of collecting, description, and collaboration.

Relevance: 10.00%

Abstract:

Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a 'noisy' environment such as contemporary social media, is to collect the pertinent information, be that information for a specific study, tweets which can inform emergency services or other responders during an ongoing crisis, or information that gives an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the datasets collected and analyzed are preformed, that is, they are built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders.

The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand, and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection.

The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate.

The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement's presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study.

The common theme amongst these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
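As a sketch of the content-analysis and user-profiling ideas described for the first paper, the snippet below scores tweets by topic and urgency terms, adds a simple authority weight for known official accounts, and keeps only as many tweets as responders can review. The keyword lists, accounts, weights and capacity limit are all hypothetical.

```python
# Illustrative sketch of the two filtering ideas described for crisis tweets:
# a content score (topic keywords + urgency terms) combined with a simple user
# profile weight, with only the top-scoring tweets passed to responders.
# Keyword lists, weights and the capacity limit are all hypothetical.
TOPIC_TERMS = {"flood", "earthquake", "evacuate", "#qldfloods"}
URGENT_TERMS = {"trapped", "help", "injured", "urgent"}
AUTHORITATIVE_USERS = {"qpsmedia", "abcemergency"}  # known official accounts

def score_tweet(text: str, author: str, follower_count: int) -> float:
    words = {w.strip(".,!?").lower() for w in text.split()}
    content = 1.0 * len(words & TOPIC_TERMS) + 2.0 * len(words & URGENT_TERMS)
    authority = 3.0 if author.lower() in AUTHORITATIVE_USERS else min(follower_count / 10_000, 1.0)
    return content + authority

tweets = [
    ("Trapped on roof, please help, water rising #qldfloods", "resident42", 120),
    ("Lovely sunset tonight", "resident42", 120),
    ("Evacuation centre open at the showgrounds #qldfloods", "qpsmedia", 250_000),
]
CAPACITY = 2  # how many tweets responders can review right now
ranked = sorted(tweets, key=lambda t: score_tweet(*t), reverse=True)
for text, author, _ in ranked[:CAPACITY]:
    print(f"{author}: {text}")
```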

Relevance: 10.00%

Abstract:

A large number of methods have been published that aim to evaluate various components of multi-view geometry systems. Most of these have focused on the feature extraction, description and matching stages (the visual front end), since geometry computation can be evaluated through simulation. Many datasets are constrained to small-scale scenes or planar scenes that are not challenging to new algorithms, or require special equipment. This paper presents a method for automatically generating geometry ground truth and challenging test cases from high spatio-temporal resolution video. The objective of the system is to enable data collection at any physical scale, in any location and in various parts of the electromagnetic spectrum. The data generation process consists of collecting high-resolution video, computing an accurate sparse 3D reconstruction, culling and downsampling video frames, and selecting test cases. The evaluation process consists of applying a test two-view geometry method to every test case and comparing the results to the ground truth. This system facilitates the evaluation of the whole geometry computation process, or any part thereof, against data compatible with a realistic application. A collection of example datasets and evaluations is included to demonstrate the range of applications of the proposed system.
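A concrete (if simplified) version of the comparison step is shown below: it measures how far an estimated two-view relative pose is from the ground truth using the rotation angle error and the angle between scale-free translation directions. The example poses are fabricated, and the paper's exact error metrics may differ.

```python
# Sketch of the comparison step: measuring how far an estimated two-view
# relative pose is from the ground-truth pose, using rotation angle error and
# the angle between (scale-free) translation directions. The example poses are
# made up; the paper's exact error metrics may differ.
import numpy as np

def rotation_error_deg(R_est: np.ndarray, R_gt: np.ndarray) -> float:
    """Angle of the residual rotation R_est * R_gt^T, in degrees."""
    R_delta = R_est @ R_gt.T
    cos_theta = np.clip((np.trace(R_delta) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))

def translation_error_deg(t_est: np.ndarray, t_gt: np.ndarray) -> float:
    """Angle between translation directions (two-view translation is up to scale)."""
    a = t_est / np.linalg.norm(t_est)
    b = t_gt / np.linalg.norm(t_gt)
    return float(np.degrees(np.arccos(np.clip(abs(a @ b), -1.0, 1.0))))

# Ground truth: 10 degree rotation about the y-axis, translation along x.
theta = np.radians(10.0)
R_gt = np.array([[np.cos(theta), 0, np.sin(theta)],
                 [0, 1, 0],
                 [-np.sin(theta), 0, np.cos(theta)]])
t_gt = np.array([1.0, 0.0, 0.0])

print(rotation_error_deg(np.eye(3), R_gt))                     # ~10.0 for an identity estimate
print(translation_error_deg(np.array([0.9, 0.1, 0.0]), t_gt))  # small angular error
```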

Relevance: 10.00%

Abstract:

Fusion techniques can be used in biometrics to achieve higher accuracy. When biometric systems are in operation and the threat level changes, controlling the trade-off between detection error rates can reduce the impact of an attack. In a fused system, varying a single threshold does not allow this to be achieved, but systematic adjustment of a set of parameters does. In this paper, fused decisions from a multi-part, multi-sample sequential architecture are investigated for that purpose in an iris recognition system. A specific implementation of the multi-part architecture is proposed, and the effect of the number of parts and samples on the resultant detection error rates is analysed. The effectiveness of the proposed architecture is then evaluated under two specific cases of obfuscation attack: miosis and mydriasis. Results show that robustness to such obfuscation attacks is achieved, with lower error rates obtained than for the non-fused base system.
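To illustrate how a multi-part decision rule exposes more than a single threshold, here is a minimal sketch in which an iris comparison is split into parts, each part is matched against its own threshold, and the fused decision requires at least k matching parts; raising k or tightening the part thresholds trades false accepts for false rejects as the threat level rises. The scores and thresholds are hypothetical, and this is not the paper's specific architecture.

```python
# Minimal sketch of multi-part decision fusion: the iris comparison is split
# into independent parts, each part yields a match/non-match decision against
# its own threshold, and the fused decision requires at least k matching parts.
# Raising k (or tightening the part thresholds) trades false accepts for false
# rejects as the threat level changes. Scores and thresholds are hypothetical.
from typing import Sequence

def fused_decision(part_distances: Sequence[float],
                   part_threshold: float,
                   k_required: int) -> bool:
    """Accept iff at least `k_required` parts have distance below the threshold."""
    matches = sum(d < part_threshold for d in part_distances)
    return matches >= k_required

# Hamming-style distances for 4 iris parts of one comparison.
distances = [0.28, 0.31, 0.45, 0.30]   # the third part is occluded/degraded

print(fused_decision(distances, part_threshold=0.33, k_required=2))  # True  (normal operation)
print(fused_decision(distances, part_threshold=0.33, k_required=4))  # False (elevated threat level)
```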

Relevance: 10.00%

Abstract:

Since the inception of the first Joint Registry in Sweden in 1979, many countries, including Finland, Norway, Denmark, Australia, New Zealand, Canada, Scotland, England and Wales, now have more than 10 years' experience and data, and are collecting data on more than 90% of the procedures performed nationally. There are also Joint Registries in Romania, Slovakia, Slovenia, Croatia, Hungary, France, Germany, Switzerland, Czech Republic, Italy, Austria and Portugal, and work is ongoing to develop a Joint Registry in the US...

Relevance: 10.00%

Abstract:

This paper describes the experimental evaluation of a novel Autonomous Surface Vehicle capable of navigating complex inland water reservoirs and measuring a range of water quality properties and greenhouse gas emissions. The 16 ft long, solar-powered catamaran is capable of collecting water column profiles whilst in motion. It is also directly integrated with a reservoir-scale floating sensor network to allow remote mission uploads, data download and adaptive sampling strategies. This paper describes the onboard vehicle navigation and control algorithms as well as obstacle avoidance strategies. Experimental results demonstrate its ability to maintain track and avoid obstacles on a variety of large-scale missions and under differing weather conditions, as well as its ability to continuously collect various water quality parameters, complementing traditional manual monitoring campaigns.
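The vehicle's actual navigation and control algorithms are not reproduced here, but a generic sketch of line-of-sight waypoint tracking, the kind of loop such an ASV might run, is shown below: steer toward the bearing of the active waypoint and advance when inside an acceptance radius. The gain, radius and coordinates are illustrative assumptions.

```python
# Simple sketch of line-of-sight waypoint tracking of the kind an ASV
# navigation loop might use: steer toward the bearing of the active waypoint
# and advance to the next one inside an acceptance radius. This is a generic
# illustration, not the vehicle's actual controller or obstacle-avoidance logic.
import math

ACCEPT_RADIUS_M = 5.0

def desired_heading(x: float, y: float, wx: float, wy: float) -> float:
    """Bearing (radians, East = 0, CCW positive) from vehicle (x, y) to waypoint."""
    return math.atan2(wy - y, wx - x)

def heading_error(current: float, desired: float) -> float:
    """Shortest signed angular difference, wrapped to [-pi, pi]."""
    return math.atan2(math.sin(desired - current), math.cos(desired - current))

def waypoint_reached(x: float, y: float, wx: float, wy: float) -> bool:
    return math.hypot(wx - x, wy - y) < ACCEPT_RADIUS_M

# One control step: vehicle at the origin heading East, waypoint to the north-east.
x, y, heading = 0.0, 0.0, 0.0
wx, wy = 40.0, 30.0
err = heading_error(heading, desired_heading(x, y, wx, wy))
rudder = max(-1.0, min(1.0, 1.5 * err))   # proportional steering command, clamped
print(f"heading error {math.degrees(err):.1f} deg, rudder command {rudder:.2f}")
```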

Relevance: 10.00%

Abstract:

Environmental monitoring is becoming critical as human activity and climate change place greater pressures on biodiversity, leading to an increasing need for data to make informed decisions. Acoustic sensors can help collect data across large areas for extended periods, making them attractive for environmental monitoring. However, managing and analysing large volumes of environmental acoustic data is a great challenge and consequently hinders the effective utilization of the big datasets collected. This paper presents an overview of our current techniques for collecting, storing and analysing large volumes of acoustic data efficiently, accurately, and cost-effectively.
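As one example of automated analysis at this scale, the sketch below reduces a recording to a spectrogram and a single summary index (mean spectral entropy), so that long recordings collapse into a few comparable numbers. The file name is hypothetical, and this is not the authors' specific index set.

```python
# Illustrative sketch of one automated analysis step for large acoustic
# archives: compute a spectrogram per recording and summarise it with a simple
# index (here, mean spectral entropy), so minutes of audio reduce to a handful
# of numbers that can be stored and compared. The file name is hypothetical and
# this is not the specific index set used by the authors.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("sensor_recording.wav")   # hypothetical field recording
if audio.ndim > 1:
    audio = audio.mean(axis=1)                       # mix to mono

freqs, times, power = spectrogram(audio.astype(float), fs=rate, nperseg=1024)

# Spectral entropy per frame: low for tonal calls, high for broadband noise.
p = power / (power.sum(axis=0, keepdims=True) + 1e-12)
entropy = -(p * np.log2(p + 1e-12)).sum(axis=0) / np.log2(p.shape[0])

print(f"{len(times)} frames, mean spectral entropy {entropy.mean():.3f}")
```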