381 results for novice
Abstract:
This paper addresses the question of how teachers learn from experience during their pre-service course and early years of teaching. It outlines a theoretical framework that may help us better understand how teachers' professional identities emerge in practice. The framework adapts Vygotsky's Zone of Proximal Development, and Valsiner's Zone of Free Movement and Zone of Promoted Action, to the field of teacher education. The framework is used to analyse the pre-service and initial professional experiences of a novice secondary mathematics teacher in integrating computer and graphics calculator technologies into his classroom practice.
Abstract:
Previous research on computers and graphics calculators in mathematics education has examined effects on curriculum content and students' mathematical achievement and attitudes, while less attention has been given to the relationship between technology use and issues of pedagogy, in particular the impact on teachers' professional learning in specific classroom and school environments. This observation is critical in the current context of educational policy making, where it is assumed – often incorrectly – that supplying schools with hardware and software will increase teachers' use of technology and encourage more innovative teaching approaches. This paper reports on a research program that aimed to develop a better understanding of how, and under what conditions, Australian secondary school mathematics teachers learn to effectively integrate technology into their practice. The research adapted Vygotsky's Zone of Proximal Development, together with Valsiner's Zones of Free Movement and Promoted Action, to devise a theoretical framework for analysing relationships between factors influencing teachers' use of technology in mathematics classrooms. This paper illustrates how the framework may be used by analysing case studies of a novice teacher and an experienced teacher in different school settings.
Abstract:
Fuzzy signal detection analysis can be a useful complementary technique to traditional signal detection theory analysis methods, particularly in applied settings. For example, traffic situations are better conceived as being on a continuum from no potential for hazard to high potential, rather than either having potential or not having potential. This study examined the relative contribution of sensitivity and response bias to explaining differences in the hazard perception performance of novices and experienced drivers, and the effect of a training manipulation. Novice drivers and experienced drivers were compared (N = 64). Half the novices received training, while the experienced drivers and half the novices remained untrained. Participants completed a hazard perception test and rated potential for hazard in occluded scenes. The response latency of participants to the hazard perception test replicated previous findings of experienced/novice differences and trained/untrained differences. Fuzzy signal detection analysis of both the hazard perception task and the occluded rating task suggested that response bias may be more central to hazard perception test performance than sensitivity, with trained and experienced drivers responding faster and with a more liberal bias than untrained novices. Implications for driver training and the hazard perception test are discussed.
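The abstract does not reproduce the fuzzy signal detection computations it relies on. As a hedged illustration, the sketch below follows the standard mixed-implication formulas of fuzzy SDT (Parasuraman, Masalonis and Hancock, 2000), assuming graded signal and response memberships in [0, 1]; the function name and data are invented for the example, not taken from the study.

```python
import numpy as np
from scipy.stats import norm

def fuzzy_sdt(s, r):
    """Fuzzy signal detection analysis.

    s : signal memberships in [0, 1] (degree of hazard in each scene)
    r : response memberships in [0, 1] (graded 'hazard' response)
    Returns (d_prime, c): sensitivity and response bias.
    """
    s, r = np.asarray(s, float), np.asarray(r, float)
    # Mixed-implication assignment of each trial to the outcomes.
    hits = np.minimum(s, r)
    false_alarms = np.minimum(1 - s, r)
    # Fuzzy hit and false-alarm rates, normalised by total signal
    # and total non-signal membership respectively.
    hit_rate = hits.sum() / s.sum()
    fa_rate = false_alarms.sum() / (1 - s).sum()
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)      # sensitivity
    c = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))   # bias (liberal < 0)
    return d_prime, c

# Illustrative data only: six scenes with graded hazard and responses.
print(fuzzy_sdt([0.9, 0.7, 0.2, 0.1, 0.8, 0.3],
                [0.8, 0.9, 0.4, 0.0, 0.6, 0.5]))
```

On this toy data a more liberal bias appears as a more negative c, which is the pattern the study reports for trained and experienced drivers.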
Abstract:
New postgraduate students embark on their research journey typically with little or no experience in doing research. Supervisors and other more experienced student researchers might help them to find their feet during the first few weeks of their research by sharing their own experience of how they solved similar problems during their research. In this way each novice researcher can learn and benefit from other researchers' ways of resolving problems. This paper discusses the real concerns that researchers reflected upon during a two-day research workshop, where researchers share problems, exchange ideas for overcoming them and learn from each other's experiences of conducting research. The output from the workshop is in the form of hints and tips that can guide novice researchers when faced with initial problems. The paper can also be used by a department to induct a novice researcher into their environment.
Abstract:
Motion discontinuities can signal object boundaries where few or no other cues, such as luminance, colour, or texture, are available. Hence, motion-defined contours are an ecologically important counterpart to luminance contours. We developed a novel motion-defined Gabor stimulus to investigate the nature of neural operators analysing visual motion fields, in order to draw parallels with known luminance operators. Luminance-defined Gabors have been successfully used to discern the spatial extent and spatial-frequency specificity of possible visual contour detectors. We now extend these studies into the motion domain. We define a stimulus using limited-lifetime moving dots whose velocity is described over 2-D space by a Gabor pattern surrounded by randomly moving dots. Participants were asked to determine whether the orientation of the Gabor pattern (and hence of the motion contours) was vertical or horizontal in a 2AFC task, and the proportion of correct responses was recorded. We found that with practice participants became highly proficient at this task, able in certain cases to reach 90% accuracy with only 12 limited-lifetime dots. However, for both practised and novice participants we found that the ability to detect a single boundary saturates with the size of the Gaussian envelope of the Gabor at approximately 5 deg full-width at half-height. At this optimal size we then varied spatial frequency and found that performance was best at the lowest measured spatial frequency (0.1 cycles/deg) and declined steadily at higher spatial frequencies, suggesting that motion contour detectors may be specifically tuned to a single, isolated edge.
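For readers unfamiliar with the stimulus, a rough sketch of how dot velocities might be assigned from a Gabor weighting over 2-D space is given below. Everything here is an assumption for illustration (dot count, speed, the 0.05 carrier threshold, the fallback to random noise motion); only the envelope size is chosen to match the reported ~5 deg full-width at half-height.

```python
import numpy as np

rng = np.random.default_rng(0)

def gabor(x, y, sigma_deg, sf_cpd, orientation="vertical"):
    """Gabor weighting over 2-D space: Gaussian envelope times a
    sinusoidal carrier; a vertical pattern varies along x."""
    carrier_axis = x if orientation == "vertical" else y
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma_deg**2))
    return envelope * np.cos(2 * np.pi * sf_cpd * carrier_axis)

# 12 limited-lifetime dots in a +/-5 deg field (illustrative values).
n_dots, speed = 12, 2.0                      # deg/s, assumed
x = rng.uniform(-5, 5, n_dots)
y = rng.uniform(-5, 5, n_dots)
# FWHH = 2*sqrt(2*ln 2)*sigma, so sigma ~ 2.1 deg gives ~5 deg FWHH.
w = gabor(x, y, sigma_deg=2.1, sf_cpd=0.1)

# Where the Gabor weight is appreciable, the dot carries the contour
# (signed horizontal motion); elsewhere it moves in a random direction,
# like the surrounding noise dots.
theta = rng.uniform(0, 2 * np.pi, n_dots)
vx = np.where(np.abs(w) > 0.05, np.sign(w) * speed, speed * np.cos(theta))
vy = np.where(np.abs(w) > 0.05, 0.0, speed * np.sin(theta))
```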
Abstract:
Object-oriented programming is seen as a difficult skill to master. There is considerable debate about the most appropriate way to introduce novice programmers to object-oriented concepts. Is it possible to uncover the critical aspects or features that enhance the learning of object-oriented programming? Practitioners have differing understandings of the nature of an object-oriented program. Uncovering these different ways of understanding leads to a greater understanding of the critical aspects and their relationship to the structure of the program produced. A phenomenographic study was conducted to uncover practitioner understandings of the nature of an object-oriented program. The study identified five levels of understanding and three dimensions of variation within these levels. These levels and dimensions of variation provide a framework for fostering conceptual change with respect to the nature of an object-oriented program.
Abstract:
This thesis initially presents an 'assay' of the literature pertaining to individual differences in human-computer interaction. A series of experiments is then reported, designed to investigate the association between a variety of individual characteristics and various computer task and interface factors. Predictor variables included age, computer expertise, and psychometric tests of spatial visualisation, spatial memory, logical reasoning, associative memory, and verbal ability. These were studied in relation to a variety of computer-based tasks, including: (i) word processing and its component elements; (ii) the location of target words within passages of text; (iii) the navigation of networks and menus; (iv) command generation using menus and command line interfaces; (v) the search and selection of icons and text labels; (vi) information retrieval. A measure of self-reported workload was also included in several of these experiments. The main experimental findings included: (i) an interaction between spatial ability and the manipulation of semantic but not spatial interface content; (ii) verbal ability being predictive of only certain task components of word processing; (iii) age differences in word processing and information retrieval speed but not accuracy; (iv) evidence of compensatory strategies being employed by older subjects; (v) evidence of performance strategy differences which disadvantaged high-spatial subjects in conditions of low spatial information content; (vi) interactive effects of associative memory, expertise and command strategy; (vii) an association between logical reasoning and word processing but not information retrieval; (viii) an interaction between expertise and cognitive demand; and (ix) a stronger association between cognitive ability and novice performance than expert performance.
Abstract:
This thesis investigates how people select items from a computer display using the mouse input device. The term computer mouse refers to a class of input devices which share certain features but may have different characteristics that influence the ways in which people use the device. Although task completion time is one of the most commonly used performance measures for input device evaluation, there is no consensus as to its definition, and most mouse studies fail to provide adequate assurances regarding its correct measurement. Precise and accurate timing software was therefore developed, permitting the recording of movement data from which automated analysis yielded the device movements made. Input system gain, an important task parameter, has been poorly defined and misconceptualized in most previous studies; the issue of gain is clarified and investigated within this thesis. Movement characteristics varied between users and within users, even for the same task conditions. The variables of target size, movement amplitude, and experience exerted significant effects on performance. Subjects consistently undershot the target area, which may be a consequence of the particular task demands. Although task completion times indicated that mouse performance had stabilized after 132 trials, the movement traces, even of very experienced users, indicated that there was still considerable room for improvement in performance, as shown by the proportion of poorly made movements. The mouse input device was suitable for older novice device users, but they took longer to complete the experimental trials. Given the diversity and inconsistency of device movements, even for the same task conditions, caution is urged when interpreting averaged group data. Performance was found to be sensitive to task conditions, device implementations, and experience in ways which are problematic for theoretical descriptions of device movement and which limit the generalizability of such findings within this thesis.
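The thesis's own definition of gain is not quoted in the abstract; for orientation, the sketch below shows the conventional dimensionless definition of control-display gain, together with the Shannon form of Fitts' index of difficulty, which is the usual way target size and movement amplitude are combined in pointing studies. The numbers are illustrative only.

```python
import math

def control_display_gain(cursor_mm, device_mm):
    """Dimensionless control-display gain: on-screen cursor displacement
    divided by physical device displacement, in the same units."""
    return cursor_mm / device_mm

def fitts_index_of_difficulty(amplitude_mm, width_mm):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(amplitude_mm / width_mm + 1)

# Illustrative: a 20 mm mouse movement drives the cursor 60 mm on screen,
# toward a 10 mm target 120 mm away.
print(control_display_gain(60.0, 20.0))        # 3.0
print(fitts_index_of_difficulty(120.0, 10.0))  # ~3.7 bits
```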
Abstract:
The fMRI Experience began as a postgraduate-organised conference to enable novice access to expertise in a developing and technically complex area, and to provide mutual support. This article investigates the seventh annual iteration of this emergent conference and evaluates its educational value. Key features are free attendance supported by sponsorship, a clear focus on student needs, and a strong social programme and participation ethos to facilitate interaction. Predominantly qualitative data suggest that the event is of value to postgraduate participants and is also successful in attracting the participation of internationally leading researchers. The implications and value of the event for postgraduate education and for developing new fields of enquiry are discussed.
Abstract:
Purpose – There appears to be an insatiable demand from markets for organisations to improve their products and services. To meet this, there is a need for business process improvement (BPI) methodologies that are holistic, structured and procedural. This paper therefore describes research that has formed and tested a generic and practical methodology, termed model-based and integrated process improvement (MIPI), to support the implementation of BPI and to validate its effectiveness in organisations. The methodology has been created as an aid for practitioners within organisations. Design/methodology/approach – The research objectives were achieved by reviewing and analysing current methodologies and selecting a few frameworks against key performance indicators; using a refined Delphi approach and semi-structured interviews with the "experts" in the field; and taking an intervention, case study and process research approach to evaluating the methodology. Findings – The BPI methodology was successfully formed and applied, by the researcher and directly by the companies involved, against the criteria of feasibility, usability and usefulness. Research limitations/implications – The paper demonstrates new knowledge on how to systematically assess a BPI methodology in practice. Practical implications – The MIPI methodology offers the practitioner (experienced and novice) a set of step-by-step aids necessary to make informed, consistent and efficient changes to business processes. Originality/value – The novelty of this research is the creation of a holistic, workbook-based methodology with relevant tools and techniques. It extends the capabilities of existing methodologies.
Abstract:
Practitioners assess the performance of entities in increasingly large and complicated datasets. Even if non-parametric models, such as Data Envelopment Analysis, were ever considered simple push-button technologies, this is impossible when many variables are available or when data have to be compiled from several sources. This paper introduces the 'COOPER-framework', a comprehensive model for carrying out non-parametric projects. The framework consists of six interrelated phases: Concepts and objectives, On structuring data, Operational models, Performance comparison model, Evaluation, and Result and deployment. Each phase describes the necessary steps a researcher should examine for a well-defined and repeatable analysis. The COOPER-framework provides the novice analyst with guidance, structure and advice for a sound non-parametric analysis, while the more experienced analyst benefits from a checklist that ensures important issues are not forgotten. In addition, through the use of a standardized framework, non-parametric assessments will be more reliable, more repeatable, more manageable, faster and less costly.
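The abstract names Data Envelopment Analysis without showing a model. As a hedged illustration of the kind of 'Operational model' the framework's phases lead to, here is a minimal input-oriented CCR efficiency computation (envelopment form) using scipy; the data and function name are invented for the example, not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of unit k (envelopment form).

    X: (m, n) inputs and Y: (s, n) outputs for n units. Solves
    min theta  s.t.  X @ lam <= theta * X[:, k],  Y @ lam >= Y[:, k],
    lam >= 0; theta = 1 means unit k lies on the efficient frontier.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = 1.0                                  # minimise theta
    A_in = np.hstack([-X[:, [k]], X])           # X @ lam - theta * x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y @ lam <= -y_k
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.fun

# Illustrative: four units, two inputs, one (identical) output.
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
print([round(ccr_efficiency(X, Y, k), 3) for k in range(4)])
```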
Abstract:
This paper describes the work undertaken in the Scholarly Ontologies Project. The aim of the project has been to develop a computational approach to support scholarly sensemaking, through interpretation and argumentation, enabling researchers to make claims: to describe and debate their view of a document's key contributions and relationships to the literature. The project has investigated the technicalities and practicalities of capturing conceptual relations, within and between conventional documents, in terms of abstract ontological structures. In this way, we have developed a new kind of index to distributed digital library systems. This paper reports a case study undertaken to test the sensemaking tools developed by the Scholarly Ontologies Project. The tools used were ClaiMapper, which allows the user to sketch argument maps of individual papers and their connections; ClaiMaker, a server on which such maps can be stored and which provides interpretative services to assist the querying of argument maps across multiple papers; and ClaimFinder, a novice interface to the search services in ClaiMaker.
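The project's actual ontology is not reproduced in this abstract. As a loose sketch of the underlying idea, claims can be stored as typed relations between documents and concepts and then queried across papers; the relation names and data below are invented for illustration, not ClaiMaker's schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Claim:
    """A typed link between two nodes (concepts or documents)."""
    source: str
    relation: str   # e.g. "supports", "refutes", "uses"
    target: str
    paper: str      # the document asserting the claim

claims = [
    Claim("zone theory", "extends", "Vygotsky's ZPD", "paper-A"),
    Claim("paper-B", "refutes", "paper-A", "paper-B"),
    Claim("fuzzy SDT", "uses", "signal detection theory", "paper-C"),
]

def find(relation, target):
    """Query the argument map: which claims assert <relation> on <target>?"""
    return [c for c in claims if c.relation == relation and c.target == target]

print(find("refutes", "paper-A"))
```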
Abstract:
Purpose: Development and evaluation of a prototype dialogue game for servitization is reported. Design/methodology/approach: This paper reports the design of the iServe game, from user-centred design through implementation in the Unity games engine to evaluation, a process which took 270 researcher hours. Findings: No relationship was found between usability and either age or gaming experience. Participants who identified themselves as non-experts in servitization recognized the potential of the game to teach servitization concepts to other novice learners. Originality/value: The potential of business games for education and executive development has been recognized, but factors including high development cost inhibit their uptake. Games engines offer a potential solution.
Abstract:
Purpose – The purpose of this paper is to outline a seven-phase simulation conceptual modelling procedure that incorporates existing practice and embeds a process reference model (i.e. SCOR). Design/methodology/approach – An extensive review of the simulation and SCM literature identifies a set of requirements for a domain-specific conceptual modelling procedure. The associated design issues for each requirement are discussed, and the utility of SCOR in the process of conceptual modelling is demonstrated using two development cases. Ten key concepts are synthesised and aligned to a general process for conceptual modelling. Further work is outlined to detail, refine and test the procedure with different process reference models in different industrial contexts. Findings – Simulation conceptual modelling is often regarded as the most important yet least understood aspect of a simulation project (Robinson, 2008a). Even today, there has been little research into guidelines to aid the creation of a conceptual model. Design issues are discussed for building an 'effective' conceptual model, and the domain-specific requirements for modelling supply chains are addressed. The ten key concepts are incorporated to aid in describing the supply chain problem (i.e. the components and relationships that need to be included in the model), model content (i.e. rules for determining the simplest model boundary and level of detail with which to implement the model) and model validation. Originality/value – The paper addresses Robinson's (2008a) call for research in defining and developing new approaches for conceptual modelling and Manuj et al.'s (2009) discussion on improving the rigour of simulation studies in SCM. It is expected that more detailed guidelines will yield benefits to both expert modellers (i.e. averting typical modelling failures) and novice modellers (i.e. guided practice and less reliance on hopeful intuition).
Abstract:
In the computer science community, there is considerable debate about the appropriate sequence for introducing object-oriented concepts to novice programmers. Research into novice programming has struggled to identify the critical aspects that would provide a consistently successful approach to teaching introductory object-oriented programming. Starting from the premise that the conceptions of a task determine the type of output from the task, assisting novice programmers to become aware of what the required output should be may lay a foundation for improving learning. This study adopted a phenomenographic approach. Thirty-one practitioners were interviewed about the ways in which they experience object-oriented programming, and categories of description and critical aspects were identified. These critical aspects were then used to examine the spaces of learning provided in twenty introductory textbooks. The study uncovered critical aspects that related to the way practitioners expressed their understanding of an object-oriented program and the influences on their approach to designing programs. The study of the textbooks revealed large variability in the coverage of these critical aspects.