412 results for computer technology enhanced pedagogy
Abstract:
The current understanding of students' group metacognition is limited: research on metacognition has focused mainly on the individual student. The aim of this study was to address this gap by developing a conceptual model to inform the use of scaffolds that facilitate group metacognition during mathematical problem solving in computer-supported collaborative learning (CSCL) environments. An initial conceptual framework based on the literature on metacognition, cooperative learning, cooperative group metacognition, and CSCL informed the study. To achieve this aim, a design research methodology incorporating two cycles was used. The first cycle focused on within-group metacognition for sixteen groups of primary school students working together around the computer; the second cycle included between-group metacognition for six groups of primary school students working together in the Knowledge Forum® CSCL environment. The study found that providing groups with group metacognitive scaffolds led them to plan, monitor, and evaluate both the task and team aspects of their group work. The scaffolds allowed students to focus on how their group was completing the problem-solving task and working together as a team. From these findings, a revised conceptual model to inform the use of scaffolds that facilitate group metacognition during mathematical problem solving in CSCL environments was generated.
Abstract:
Computed tomography has been used to image and reconstruct in 3-D an Egyptian mummy from the collection of the British Museum. This study of Tjentmutengebtiu, a priestess from the 22nd dynasty (945-715 BC), revealed invaluable information of a scientific, Egyptological and palaeopathological nature without mutilation or destruction of the painted cartonnage case or linen wrappings. Precise details on the removal of the brain through the nasal cavity and of the viscera from the abdominal cavity were obtained. The nature and composition of the false eyes were investigated. Detailed analysis of the teeth provided a much closer approximation of age at death. The identification of the materials used for the various amulets, including the figures placed in the viscera, was graphically demonstrated using this technique.
Abstract:
The consistently high failure rate in Queensland University of Technology's introductory programming subject reflects a dilemma facing other universities worldwide. Experiments were conducted over a number of semesters to quantify the effectiveness of collaborative learning for introductory-level programming students, replicating previous studies in this area. A selection of workshops in the introductory programming subject required students to problem-solve and program in pairs, mimicking the eXtreme Programming practice of pair programming. The failure rate for the subject fell from what had been an average of 30% since 2003 (with a high of 41% in 2006) to just 5% for those students who worked consistently in pairs.
Abstract:
We consider a new form of authenticated key exchange, which we call multi-factor password-authenticated key exchange, in which session establishment depends on successful authentication of multiple short secrets that are complementary in nature, such as a long-term password and a one-time response. This allows the client and server to be mutually assured of each other's identity without directly disclosing private information to the other party. Multi-factor authentication can provide an enhanced level of assurance in higher-security scenarios such as online banking, virtual private network access, and physical access, because a multi-factor protocol is designed to remain secure even if all but one of the factors has been compromised. We introduce a security model for multi-factor password-authenticated key exchange protocols, propose an efficient and secure protocol called MFPAK, and provide a security argument to show that our protocol is secure in this model. Our security model is an extension of the Bellare-Pointcheval-Rogaway security model for password-authenticated key exchange and accommodates an arbitrary number of symmetric and asymmetric authentication factors.
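To make the multi-factor idea concrete, here is a minimal sketch of the principle that session establishment must require every factor to verify, so that compromising one factor alone is not enough. This is emphatically not the MFPAK protocol itself (a real password-authenticated key exchange never handles factors this plainly); all names and helpers below are hypothetical illustrations.

```python
# Toy two-factor check: authentication succeeds only if ALL factors verify.
# Illustrative sketch only -- NOT the MFPAK protocol from the paper.
import hashlib
import hmac
import os

def _verifier(secret: str, salt: bytes) -> bytes:
    # The server stores only salted digests, never the raw factors.
    return hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 100_000)

# Server-side enrolment: one verifier per authentication factor (hypothetical values).
salt = os.urandom(16)
stored = {
    "password": _verifier("correct horse battery staple", salt),
    "one_time": _verifier("482913", salt),  # e.g. a one-time response
}

def authenticate(password: str, one_time: str) -> bool:
    """Succeed only if every factor checks out; a single compromised
    factor is therefore insufficient to impersonate the client."""
    ok_pw = hmac.compare_digest(stored["password"], _verifier(password, salt))
    ok_ot = hmac.compare_digest(stored["one_time"], _verifier(one_time, salt))
    return ok_pw and ok_ot  # constant-time comparisons, no early exit

print(authenticate("correct horse battery staple", "482913"))  # True
print(authenticate("correct horse battery staple", "000000"))  # False: one factor fails
```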
Abstract:
To meet new challenges of Enterprise Systems that go well beyond the initial implementation, contemporary organizations seek business process experts with software skills. Despite a healthy demand from industry for such expertise, recent studies reveal that most Information Systems (IS) graduates are ill-equipped to meet the challenges of modern organizations. This paper shares insights and experiences from a course designed to provide a business-process-centric view of a market-leading Enterprise System. The course, designed for both undergraduate and graduate students, uses two common business processes in a case study that employs both sequential and explorative exercises. Student feedback, gained through two longitudinal surveys across two phases of the course, shows promising signs for the teaching approach.
Abstract:
Few frameworks for the teaching and assessment of programming subjects are coherent and logical, and fewer still are sufficiently generic and adaptable to be used outside the particular tertiary institutions in which they were developed. This paper presents the Teaching and Assessment of Software Development (TASD) framework. We describe its development and implementation at an Australian university and demonstrate, with examples and supporting data, how it has been used. Extracts of criteria sheets (grading rubrics) for a variety of assessment tasks are included. The numerous advantages of this new framework are discussed, with comparisons made to those reported in the published literature.
Abstract:
Computer forensics is the process of gathering and analysing evidence from computer systems to aid in the investigation of a crime. Typically, such investigations are undertaken by human forensic examiners using purpose-built software to discover evidence from a computer disk. This process is a manual one, and the time it takes for a forensic examiner to conduct such an investigation is proportional to the storage capacity of the computer's disk drives. The heterogeneity and complexity of the various data formats stored on modern computer systems compound the problems posed by the sheer volume of data. The decision to undertake a computer forensic examination of a computer system is a decision to commit significant quantities of a human examiner's time. Where there is no prior knowledge of the information contained on a computer system, this commitment of time and energy occurs with little idea of the potential benefit to the investigation. The key contribution of this research is the design and development of an automated process to describe a computer system and its activity for the purposes of a computer forensic investigation. The term proposed for this process is computer profiling. A model of a computer system and its activity has been developed over the course of this research. Using this model, a computer system under investigation can be automatically described in terms useful to a forensic investigator. The computer profiling process is resilient to attempts to disguise malicious computer activity. This resilience is achieved by detecting inconsistencies in the information used to infer the apparent activity of the computer. The practicality of the computer profiling process has been demonstrated by a proof-of-concept software implementation. The model and the prototype implementation utilising the model were tested with data from real computer systems. The resilience of the process to attempts to disguise malicious activity has also been demonstrated with practical experiments conducted with the same prototype software implementation.
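The inconsistency-detection idea at the heart of this resilience can be sketched simply: cross-check independent descriptions of the same activity and flag disagreements. The record format, events, and tolerance below are hypothetical illustrations, not the thesis model itself.

```python
# Minimal sketch of inconsistency detection: two independent sources that
# describe the same event but disagree on when it happened may indicate
# that one source was tampered with to disguise activity.
from datetime import datetime, timedelta

# Hypothetical extracted artefacts: (source, event, claimed time).
observations = [
    ("filesystem", "report.doc modified", datetime(2024, 5, 1, 14, 2)),
    ("event_log",  "report.doc modified", datetime(2024, 5, 3, 9, 45)),
    ("filesystem", "tool.exe created",    datetime(2024, 5, 2, 8, 0)),
    ("event_log",  "tool.exe created",    datetime(2024, 5, 2, 8, 0)),
]

def inconsistencies(obs, tolerance=timedelta(minutes=5)):
    """Group observations by event and yield any event whose sources
    disagree on its time by more than the tolerance."""
    by_event = {}
    for source, event, when in obs:
        by_event.setdefault(event, []).append((source, when))
    for event, claims in by_event.items():
        times = [when for _, when in claims]
        if max(times) - min(times) > tolerance:
            yield event, claims

for event, claims in inconsistencies(observations):
    print(f"inconsistent timestamps for {event!r}: {claims}")
```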
Abstract:
In this paper, technology is described as involving processes whereby resources are utilised to satisfy human needs or to take advantage of opportunities, and to develop practical solutions to problems. This study, set within one type of technology context, information technology, investigated how elements of technological processes were made explicit to students through a one-semester undergraduate university course. While the development and implementation of this course acknowledged that students needed to learn technical skills, technological skills and knowledge, including design, were also seen as vital, to enable students to think about information technology from a perspective not confined to 'technology as hardware and software'. This paper describes how the course, set within a three-year program of study, was aimed at helping students to develop their thinking and their knowledge about design processes in an explicit way. An interpretive research approach was used, and data sources included a repertory grid 'survey'; student interviews; video recordings of classroom interactions; audio recordings of lectures; observations of classroom interactions made by researchers; and artefacts, which included students' journals and portfolios. The development of students' knowledge about design practices is discussed, and reflections are made upon student knowledge development in conjunction with their learning experiences. Implications for ensuring explicitness of design practice within information technology contexts are presented, and the need to identify what constitutes design knowledge is argued.
Abstract:
Digital forensics investigations aim to find evidence that helps confirm or disprove a hypothesis about an alleged computer-based crime. However, the ease with which computer-literate criminals can falsify computer event logs makes the prosecutor's job highly challenging. Given a log which is suspected to have been falsified or tampered with, a prosecutor is obliged to provide a convincing explanation for how the log may have been created. Here we focus on showing how a suspect computer event log can be transformed into a hypothesised actual sequence of events, consistent with independent, trusted sources of event orderings. We present two algorithms which allow the effort involved in falsifying logs to be quantified, as a function of the number of 'moves' required to transform the suspect log into the hypothesised one, thus allowing a prosecutor to assess the likelihood of a particular falsification scenario. The first algorithm always produces an optimal solution but, for reasons of efficiency, is suitable for short event logs only. To deal with the massive amount of data typically found in computer event logs, we also present a second heuristic algorithm which is considerably more efficient but may not always generate an optimal outcome.
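One plausible way to make the 'moves' metric concrete, assuming a move is a single adjacent swap of two events, is to count inversions between the suspect ordering and the hypothesised one; the abstract does not specify the paper's own move definition or algorithms, so the sketch below is an illustrative assumption, and its quadratic scan matches only the short-log case.

```python
# Illustrative sketch: quantify falsification effort as the minimum number of
# adjacent swaps ('moves') turning the suspect log into the hypothesised
# sequence of the same events -- i.e. the inversion count. This assumes
# distinct events and a swap-based move definition, which is hypothetical.
def moves_between(suspect, hypothesised):
    """Minimum adjacent swaps transforming the suspect log into the
    hypothesised actual sequence of the same (distinct) events."""
    position = {event: i for i, event in enumerate(hypothesised)}
    ranks = [position[event] for event in suspect]
    # Count inversions with a simple O(n^2) scan; fine for short logs,
    # mirroring the paper's note that the optimal algorithm suits short logs.
    return sum(
        1
        for i in range(len(ranks))
        for j in range(i + 1, len(ranks))
        if ranks[i] > ranks[j]
    )

suspect = ["login", "delete_file", "copy_file", "logout"]
actual = ["login", "copy_file", "delete_file", "logout"]
print(moves_between(suspect, actual))  # 1: one adjacent swap suffices
```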