983 results for Multiple datasets
Abstract:
This thesis reports on a multiple case study of the actions of three Queensland secondary schools in the context of Year 9 NAPLAN numeracy testing, focusing on their administrative practices, curriculum, pedagogy and assessment. It was established that schools have found it both challenging and costly to operate in an environment of educational reform generally, and NAPLAN testing in particular. The lack of a common understanding of numeracy and the substantial demands of implementing the Australian Curriculum have impacted on schools' ability to prepare students appropriately for NAPLAN numeracy tests. It was concluded that there is scope for schools to improve their approaches to NAPLAN numeracy testing in a way that maximises learning as well as test outcomes.
Abstract:
This paper explores the concept that individual dancers leave traces in a choreographer’s body of work and similarly, that dancers carry forward residue of embodied choreographies into other working processes. This presentation will be grounded in a study of the multiple iterations of a programme of solo works commissioned in 2008 from choreographers John Jasperse, Jodi Melnick, Liz Roche and Rosemary Butcher and danced by the author. This includes an exploration of the development by John Jasperse of themes from his solo into the pieces PURE (2008) and Truth, Revised Histories, Wishful Thinking and Flat Out Lies (2009); an adaptation of the solo Business of the Bloom by Jodi Melnick in 2008 and a further adaptation of Business of the Bloom by this author in 2012. It will map some of the developments that occurred through a number of further performances over five years of the solo Shared Material on Dying by Liz Roche and the working process of the (uncompleted) solo Episodes of Flight by Rosemary Butcher. The purpose is to reflect back on authorship in dance, an art form in which lineages of influence can often be clearly observed. Normally, once a choreographic work is created and performed, it is archived through video recording, notation and/or reviews. The dancer is no longer called upon to represent the dance piece within the archive and thus her/his lived presence and experiential perspective disappears. The author will draw on the different traces still inhabiting her body as pathways towards understanding how choreographic movement circulates beyond this moment of performance. This will include the interrogation of ownership of choreographic movement, as once it becomes integrated in the body of the dancer, who owns the dance? 
Furthermore, certain dancers, through their individual physical characteristics and moving identities, can deeply influence the formation of choreographic signatures, a proposition that challenges the sole authorship role of the choreographer in dance production. This paper will be delivered in a presentation format that will bleed into movement demonstrations alongside video footage of the works and auto-ethnographic accounts of dancing experience. A further source of knowledge will be drawn from extracts of interviews with other dancers including Sara Rudner, Rebecca Hilton and Catherine Bennett.
Abstract:
This paper presents an overview of the strengths and limitations of existing and emerging geophysical tools for landform studies. The objectives are to discuss recent technical developments and to provide a review of relevant recent literature, with a focus on propagating field methods with terrestrial applications. For various methods in this category, including ground-penetrating radar (GPR), electrical resistivity (ER), seismics, and electromagnetic (EM) induction, the technical backgrounds are introduced, followed by a section on novel developments relevant to landform characterization. For several decades, GPR has been popular for characterization of the shallow subsurface and in particular sedimentary systems. Novel developments in GPR include the use of multi-offset systems to improve signal-to-noise ratios and data collection efficiency, amongst others, and the increased use of 3D data. Multi-electrode ER systems have become popular in recent years as they allow for relatively fast and detailed mapping. Novel developments include time-lapse monitoring of dynamic processes as well as the use of capacitively-coupled systems for fast, non-invasive surveys. EM induction methods are especially popular for fast mapping of spatial variation, but can also be used to obtain information on the vertical variation in subsurface electrical conductivity. In recent years several examples of the use of plane wave EM for characterization of landforms have been published. Seismic methods for landform characterization include seismic reflection and refraction techniques and the use of surface waves. A recent development is the use of passive sensing approaches. The use of multiple geophysical methods, which can benefit from the sensitivity to different subsurface parameters, is becoming more common.
Strategies for coupled and joint inversion of complementary datasets will, once more widely available, benefit the geophysical study of landforms. Three case studies are presented on the use of electrical and GPR methods for characterization of landforms ranging from meters to hundreds of meters in dimension. In a study of polygonal patterned ground in the Saginaw Lowlands, Michigan, USA, electrical resistivity tomography was used to characterize differences in subsurface texture and water content associated with polygon-swale topography. Also, a sand-filled thermokarst feature was identified using electrical resistivity data. The second example is on the use of constant spread traversing (CST) for characterization of large-scale glaciotectonic deformation in the Ludington Ridge, Michigan. Multiple CST surveys parallel to an ~60 m high cliff, where broad (~100 m) synclines and narrow clay-rich anticlines are visible, illustrated that at least one of the narrow structures extended inland. A third case study discusses internal structures of an eolian dune on a coastal spit in New Zealand. Both 35 and 200 MHz GPR data, which clearly identified a paleosol and internal sedimentary structures of the dune, were used to improve understanding of the development of the dune, which may shed light on paleo-wind directions.
Abstract:
This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing risks that may arise during process execution. Risk reduction involves decreasing the likelihood and severity of a process fault from occurring. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest to the participant the action to perform which minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements like task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to tasks to be performed, in order to deal with the interplay between risks relative to different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system and its effectiveness has been evaluated using a real-life scenario, in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that the process instances executed concurrently complete with significantly fewer faults and with lower fault severities, when the recommendations provided by our recommendation system are taken into account.
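The cross-instance resource assignment described above can be sketched as follows. This is a minimal stand-in with invented risk scores: in the paper, risks are predicted by decision trees mined from execution logs and the assignment is solved with integer linear programming, whereas the sketch below brute-forces the same minimum-total-risk assignment using only the standard library.

```python
from itertools import permutations

def min_risk_assignment(risk):
    """Find the resource-to-task assignment with the lowest total predicted
    risk. risk[r][t] is the predicted risk if resource r performs task t
    (hypothetical scores; the paper derives them from decision trees and
    solves the assignment via integer linear programming, which scales
    better than this exhaustive search)."""
    n = len(risk)
    best_total, best_perm = float("inf"), None
    for perm in permutations(range(n)):  # perm[r] = task assigned to resource r
        total = sum(risk[r][perm[r]] for r in range(n))
        if total < best_total:
            best_total, best_perm = total, perm
    return best_total, best_perm

# Hypothetical predicted risks for 3 resources x 3 tasks
risk = [
    [0.9, 0.2, 0.4],
    [0.3, 0.8, 0.6],
    [0.5, 0.7, 0.1],
]
total, assignment = min_risk_assignment(risk)
```

Here the optimum assigns resource 0 to task 1, resource 1 to task 0, and resource 2 to task 2, minimizing the summed predicted risk across concurrently running instances.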
Abstract:
Welcome to the Evaluation of course matrix. This matrix is designed for highly qualified discipline experts to evaluate their course, major or unit in a systematic manner. The primary purpose of the Evaluation of course matrix is to provide a tool with which a group of academic staff at universities can collaboratively review the assessment within a course, major or unit annually. The annual review will result in you being ready for an external curricula review at any point in time. This tool is designed for use in a workshop format with one, two or more academic staff, and will lead to an action plan for implementation. I hope you find this tool useful in your assessment review.
Abstract:
The autonomous capabilities of collaborative unmanned aircraft systems are growing rapidly. Without appropriate transparency, the effectiveness of the future multiple Unmanned Aerial Vehicle (UAV) management paradigm will be significantly limited by the human agent’s cognitive abilities, with the operator’s Cognitive Workload (CW) and Situation Awareness (SA) becoming disproportionate. This poses a challenge in evaluating the impact of robot autonomous capability feedback, which gives the human agent, in a supervisory role, greater transparency into the robot’s autonomous status. This paper presents the motivation, aim, related works, experiment theory, methodology, results and discussion, and the future work succeeding this preliminary study. The results illustrate that, with greater transparency of a UAV’s autonomous capability, an overall improvement in the subjects’ cognitive abilities was evident: with 95% confidence, the test subjects’ mean CW showed a statistically significant reduction, while their mean SA showed a significant increase.
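The reported between-condition comparison of mean CW can be illustrated with a Welch's t statistic. This is a sketch on invented workload ratings, not the study's data or its exact statistical procedure:

```python
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal
    variances. A negative t means sample a has the lower mean, e.g.
    reduced workload ratings under the higher-transparency condition."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / (va + vb) ** 0.5

# Hypothetical workload ratings (higher = more workload)
transparent = [42, 38, 45, 40, 37, 41]  # with autonomy-status feedback
baseline    = [55, 60, 52, 58, 57, 54]  # without feedback
t = welch_t(transparent, baseline)      # strongly negative: CW reduced
```

A large negative t (compared against the t distribution with Welch-Satterthwaite degrees of freedom) would support the paper's claim of a statistically significant CW reduction at the 95% confidence level.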
Abstract:
The quality of environmental decisions should be gauged according to managers' objectives. Management objectives generally seek to maximize quantifiable measures of system benefit, for instance population growth rate. Reaching these goals often requires a certain degree of learning about the system. Learning can occur by using management action in combination with a monitoring system. Furthermore, actions can be chosen strategically to obtain specific kinds of information. Formal decision making tools can choose actions to favor such learning in two ways: implicitly via the optimization algorithm that is used when there is a management objective (for instance, when using adaptive management), or explicitly by quantifying knowledge and using it as the fundamental project objective, an approach new to conservation. This paper outlines three conservation project objectives - a pure management objective, a pure learning objective, and an objective that is a weighted mixture of these two. We use eight optimization algorithms to choose actions that meet project objectives and illustrate them in a simulated conservation project. The algorithms provide a taxonomy of decision making tools in conservation management when there is uncertainty surrounding competing models of system function. The algorithms build upon each other such that their differences are highlighted and practitioners may see where their decision making tools can be improved. © 2010 Elsevier Ltd.
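The weighted-mixture objective outlined above can be sketched as a simple convex combination. The value scales and action names below are illustrative assumptions, and the paper's eight optimization algorithms are not reproduced here:

```python
def project_objective(management_value, learning_value, w):
    """Weighted mixture of a pure management objective (w = 1) and a pure
    learning objective (w = 0). Values are on an assumed common scale."""
    if not 0.0 <= w <= 1.0:
        raise ValueError("w must lie in [0, 1]")
    return w * management_value + (1 - w) * learning_value

def best_action(actions, w):
    """Pick the action that maximizes the mixed objective."""
    return max(actions, key=lambda a: project_objective(a["mgmt"], a["learn"], w))

# Hypothetical conservation actions scored on benefit and information gain
actions = [
    {"name": "survey",  "mgmt": 0.2, "learn": 0.9},  # mostly informative
    {"name": "protect", "mgmt": 0.8, "learn": 0.1},  # mostly beneficial
]
pure_management = best_action(actions, w=1.0)["name"]
pure_learning   = best_action(actions, w=0.0)["name"]
```

Sweeping w between 0 and 1 moves the recommended action from the purely informative choice to the purely beneficial one, mirroring the taxonomy of objectives the paper describes.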
Abstract:
Player experiences and expectations are connected. The presumptions players have about how they control their gameplay interactions may shape the way they play and perceive videogames. A successfully engaging player experience might rest on the way controllers meet players' expectations. We studied player interaction with novel controllers on the Sony PlayStation Wonderbook, an augmented reality (AR) gaming system. Our goal was to understand player expectations regarding game controllers in AR game design. Based on this preliminary study, we propose several interaction guidelines for hybrid input from both augmented reality and physical game controllers.
Abstract:
This thesis in software engineering presents a novel automated framework to identify similar operations utilized by multiple algorithms for solving related computing problems. It provides a new effective solution for multi-application-based algorithm analysis, employing fundamentally lightweight static analysis techniques compared to state-of-the-art approaches. Significant performance improvements are achieved across the objective algorithms through enhancing the efficiency of the identified similar operations, targeting discrete application domains.
Abstract:
The purpose of this study was to examine the main and interactive effects of four dimensions of professional commitment on strain (i.e., depression, anxiety, perceived health status, and job dissatisfaction) for a sample of 176 law professionals. The study utilized a two-wave design in which professional commitment and strain were measured at Time 1 and strain was measured again at Time 2 (T2), 2 months later. A significant two-way interaction indicated that high affective commitment was related to less T2 job dissatisfaction only for lawyers with low accumulated costs. A significant four-way interaction indicated that high affective professional commitment was only related to fewer symptoms of T2 anxiety for lawyers with high normative professional commitment and both low limited alternatives and low accumulated costs. A similar pattern of results emerged in regard to T2 perceived health status. The theoretical and practical implications of these results for career counselors are discussed.
Abstract:
Discounted Cumulative Gain (DCG) is a well-known ranking evaluation measure for models built with multi-graded relevance data. By handling the tagging data used in recommendation systems as an ordinal relevance set of {negative, null, positive}, we propose to build a DCG-based recommendation model. We present an efficient and novel learning-to-rank method that optimizes DCG for a recommendation model using the tagging data interpretation scheme. Evaluating the proposed method on real-world datasets, we demonstrate that the method is scalable and outperforms the benchmark methods by generating a quality top-N item recommendation list.
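A minimal sketch of DCG over the ordinal grade set described above, mapping negative/null/positive to -1/0/1 and assuming a standard log2 position discount (the paper's actual optimization procedure is not reproduced here):

```python
import math

def dcg(relevances):
    """Discounted Cumulative Gain: graded relevance discounted by the
    log2 of the (1-based) rank position."""
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))

def ndcg(relevances):
    """DCG normalized by the ideal (descending-grade) ordering."""
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

# Tagging data mapped to ordinal grades: positive=1, null=0, negative=-1
ranked = [1, 0, 1, -1, 0]  # hypothetical top-5 recommendation list
score = ndcg(ranked)
```

A learning-to-rank method of the kind the paper proposes would adjust the model so that positively tagged items climb toward the top of the list, pushing this score toward 1.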
Abstract:
Student perceptions of teaching have often been used in tertiary education for evaluation purposes. However, there is a paucity of research on the validity, reliability, and applicability of instruments that cover a wide range of student perceptions of pedagogies and practices in high school settings for descriptive purposes. The study attempts to validate an inventory of pedagogy and practice (IPP) that provides researchers and practitioners with a psychometrically sound instrument that covers the most salient factors related to teaching. Using a sample of students (N = 1515) from 39 schools in Singapore, 14 factors about teaching in English lessons from the students’ perspective were tested with confirmatory factor analysis (classroom task goal, structure and clarity, curiosity and interest, positive class climate, feedback, questioning, quality homework, review of students’ work, conventional teaching, exam preparation, behaviour management, maximizing learning time, student-centred pedagogy, and subject domain teaching). Two external criterion factors were used to further test the IPP factor structure. The inventory will enable teachers to understand more about their teaching and researchers to examine how teaching may be related to learning outcomes.
Abstract:
Background: Dementia is a chronic illness without cure or effective treatment, which results in declining mental and physical function and a need for assistance from others to manage activities of daily living. Many people with dementia live in long term care facilities, yet research into their quality of life (QoL) was rare until the last decade. Previous studies failed to incorporate important variables related to the facility and care provision or to look closely at the daily lives of residents. This paper presents a protocol for a comprehensive, multi-perspective assessment of the QoL of residents with dementia living in long term care in Australia. A secondary aim is to investigate the effectiveness of self-report instruments for measuring QoL. Methods: The study utilizes a descriptive, mixed methods design to examine how facility, care staff, and resident factors impact QoL. Over 500 residents with dementia from a stratified, random sample of 53 facilities are being recruited. A sub-sample of 12 residents is also taking part in qualitative interviews and observations. Conclusions: This national study will provide a broad understanding of factors underlying QoL for residents with dementia in long term care. The present study uses a similar methodology to the US-based Collaborative Studies of Long Term Care (CS-LTC) Dementia Care Study, applying it to the Australian setting.