876 results for Computer software - Development
Abstract:
In this research paper, we study a simple programming problem that only requires knowledge of variables and assignment statements, and yet we found that some early novice programmers had difficulty solving the problem. We also present data from think aloud studies which demonstrate the nature of those difficulties. We interpret our data within a neo-Piagetian framework which describes cognitive developmental stages through which students pass as they learn to program. We describe in detail think aloud sessions with novices who reason at the neo-Piagetian preoperational level. Those students exhibit two problems. First, they focus on very small parts of the code and lose sight of the "big picture". Second, they are prone to focus on superficial aspects of the task that are not functionally central to the solution. It is not until the transition into the concrete operational stage that decentration of focus occurs: students then have the cognitive ability to reason about abstract quantities that are conserved, and are equipped to adapt their skills to closely related tasks. Our results, and the neo-Piagetian framework on which they are based, suggest that changes are necessary in teaching practice to better support novices who have not reached the concrete operational stage.
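The abstract does not reproduce the problem itself. Purely as an illustration of the kind of task that requires only variables and assignment, consider a value swap, which a reader who traces one assignment at a time, without the "big picture", can easily get wrong:

```python
# Hypothetical example of a task solvable with only variables and assignment.
# Question for the novice: after these statements run, what are a and b?
a = 3
b = 7
temp = a   # hold a's original value
a = b      # a now holds 7
b = temp   # b receives a's original value, 3
print(a, b)  # 7 3
```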
Abstract:
Recent research from within a neo-Piagetian perspective proposes that novice programmers pass through the sensorimotor and preoperational stages before being able to reason at the concrete operational stage. However, academics traditionally teach and assess introductory programming as if students commence at the concrete operational stage. In this paper, we present results from a series of think aloud sessions with a single student, known by the pseudonym “Donald”. We conducted the sessions mainly over one semester, with an additional session three semesters later. Donald first manifested predominantly sensorimotor reasoning, followed by preoperational reasoning, and finally concrete operational reasoning. This longitudinal think aloud study of Donald is the first direct observational evidence of a novice programmer progressing through the neo-Piagetian stages.
Abstract:
Recent studies have linked the ability of novice (CS1) programmers to read and explain code with their ability to write code. This study extends earlier work by asking CS2 students to explain object-oriented data structures problems that involve recursion. Results show a strong correlation between the ability to explain code at an abstract level and performance on code-writing and code-reading test questions for these object-oriented data structures problems. The authors postulate that a common set of skills concerned with reasoning about programs explains the correlation between writing code and explaining code. The authors suggest that an overly exclusive emphasis on code writing may be detrimental to learning to program. Non-code-writing learning activities (e.g., reading and explaining code) are likely to improve student ability to reason about code and, by extension, to write code. A judicious mix of code-writing and code-reading activities is recommended.
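As an illustration (not drawn from the study itself), an "explain in plain English" question on recursive object-oriented code might present something like the following; an abstract-level answer states what the method computes rather than paraphrasing each line:

```python
# Illustrative code for an "explain in plain English" task (assumed, not
# taken from the paper): what does count() do, described in one sentence?
class Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next_node = next_node

def count(node):
    if node is None:
        return 0
    return 1 + count(node.next_node)

chain = Node(1, Node(2, Node(3)))
print(count(chain))  # 3
# Line-by-line answer: "it returns 0 for None, otherwise adds 1 and recurses."
# Abstract answer: "it returns the number of nodes in the list."
```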
Abstract:
This article examines the design of ePortfolios for music postgraduate students using a practice-led, iterative design research process. It is suggested that the availability of Web 2.0 technologies such as blogs and social network software potentially provides creative artists with an opportunity to engage in a dialogue about art, with artefacts of the artists' products and processes present in that discussion. The design process applied the Software Development as Research (SoDaR) methodology to simultaneously develop design and pedagogy. The approach to designing ePortfolio systems applied four theoretical protocols to examine the use of digitized artefacts to enable a dynamic and inclusive dialogue around representations of the students' work. A negative case analysis identified a disjuncture between university access and control policy and the relative openness of Web 2.0 systems outside the institution, which led to the design of an integrated model of ePortfolio.
Abstract:
Safety of repair, maintenance, alteration, and addition (RMAA) works has long been neglected because RMAA works are often small in scale and last only a short period of time. With the rising importance of the RMAA sector in many developed societies, safety of RMAA works has begun to draw attention. Many RMAA contracting companies are small- and medium-sized enterprises (SMEs) that do not have comprehensive safety management systems. Existing safety legislation and regulations for new construction sites are not fully applicable to RMAA works. Instead of relying on explicit and well-established safety systems, tacit safety knowledge plays an extremely important role in RMAA projects. To improve the safety of RMAA works, safety knowledge should be better managed. However, safety knowledge is difficult to capture in RMAA works. This study aims to examine safety management practices of RMAA contracting companies to see how safety knowledge of RMAA projects is managed. Findings show that RMAA contracting companies undertaking large-scale RMAA projects take more safety management initiatives. Safety management of small-scale RMAA works relies heavily on the motivation of site supervisors and the self-regulation of workers. Better tacit knowledge management improves safety performance. To enhance the safety capability of RMAA contracting companies, a knowledge sharing culture should be cultivated. The government should provide assistance to SMEs to implement proper safety management practices in small-sized projects. The potential of applying computer software technology in RMAA projects to capture, store, and retrieve safety information should be explored. Employees should be motivated to share safety knowledge by giving proper recognition to those who are willing to share.
Abstract:
There has been tremendous interest in watermarking multimedia content during the past two decades, mainly for proving ownership and detecting tampering. Digital fingerprinting, which deals with identifying malicious users, has also received significant attention. While extensive work has been carried out in watermarking of images, other multimedia objects still have enormous research potential. Watermarking database relations is one of the several areas which demand research focus owing to the commercial implications of database theft. Recently, there has been little progress in database watermarking, with most watermarking schemes modeled after the irreversible database watermarking scheme proposed by Agrawal and Kiernan. Reversibility is the ability to regenerate the original (unmarked) relation from the watermarked relation using a secret key. As explained in our paper, reversible watermarking schemes provide greater security against secondary watermarking attacks, in which an attacker watermarks an already marked relation in an attempt to erase the original watermark. This paper proposes an improvement over the reversible and blind watermarking scheme presented in [5], identifying and eliminating a critical problem with the previous model. Experiments show that the average watermark detection rate is around 91% even with the attacker distorting half of the attributes. The proposed scheme also provides security against secondary watermarking attacks.
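As a minimal sketch of the reversibility idea only (it does not reproduce the scheme of [5] or its improvement), the secret key below both selects which tuples to mark and supplies each mark bit, and expansion embedding lets the exact original values be regenerated from the marked relation and the key alone:

```python
# Sketch of reversible, blind, key-based watermarking of one integer
# attribute; the relation is modeled as {primary_key: value}. The key name,
# gamma, and expansion embedding are illustrative assumptions.
import hmac, hashlib

SECRET_KEY = b"owner-secret"   # assumed secret key
GAMMA = 2                      # mark roughly one tuple in GAMMA

def _keyed(pk):
    d = hmac.new(SECRET_KEY, str(pk).encode(), hashlib.sha256).digest()
    return int.from_bytes(d[:8], "big")

def embed(relation):
    marked = {}
    for pk, v in relation.items():
        r = _keyed(pk)
        if r % GAMMA == 0:                         # key picks tuples to mark
            marked[pk] = 2 * v + ((r >> 1) & 1)    # reversible expansion
        else:
            marked[pk] = v
    return marked

def restore_and_detect(marked):
    original, hits, total = {}, 0, 0
    for pk, v in marked.items():
        r = _keyed(pk)
        if r % GAMMA == 0:
            original[pk] = v // 2                  # exact original regenerated
            hits += (v & 1) == ((r >> 1) & 1)
            total += 1
        else:
            original[pk] = v
    return original, (hits / total if total else 0.0)

rel = {1: 100, 2: 205, 3: 310, 4: 42}
orig, rate = restore_and_detect(embed(rel))
assert orig == rel                                 # reversibility holds
print(rate)                                        # 1.0 on an unattacked relation
```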
Abstract:
This article explores how universities might engage more effectively with the imperative to develop students’ 21st century skills for the information society, by examining learning challenges and professional learning strategies of successful digital media professionals. The findings of qualitative interviews with professionals from Australian games, online publishing, apps and software development companies reinforce an increasing body of literature that suggests that legacy university structures and pedagogical approaches are not conducive to learning for professional capability in the digital age. Study participants were ambivalent about the value of higher education to digital careers, in general preferring a range of situated online and face-to-face social learning strategies for professional currency. This article draws upon the learning preferences of the professionals in this study to present a model of 21st century learning, as linked with extant theory relating to informal, self-determined learning and communities of practice.
Abstract:
Existing techniques for automated discovery of process models from event logs largely focus on extracting flat process models. In other words, they fail to exploit the notion of subprocess, as well as structured error handling and repetition constructs provided by contemporary process modeling notations, such as the Business Process Model and Notation (BPMN). This paper presents a technique for automated discovery of BPMN models containing subprocesses, interrupting and non-interrupting boundary events, and loop and multi-instance markers. The technique analyzes dependencies between data attributes associated with events, in order to identify subprocesses and to extract their associated logs. Parent process and subprocess models are then discovered separately using existing techniques for flat process model discovery. Finally, the resulting models and logs are heuristically analyzed in order to identify boundary events and markers. A validation with one synthetic and two real-life logs shows that process models derived using the proposed technique are more accurate and less complex than those derived with flat process model discovery techniques.
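As a sketch of the log-splitting step described above (the attribute name and heuristic are assumed for illustration, not taken from the paper), events carrying a data attribute that identifies a subprocess instance are pulled into a sub-log, so that parent and subprocess models can each be discovered with a flat miner:

```python
# Split an event log into a parent log and subprocess logs based on a data
# attribute; "order_line" is a hypothetical subprocess-instance identifier.
from collections import defaultdict

log = [  # each event: (case_id, activity, data attributes)
    ("c1", "Receive order", {}),
    ("c1", "Pick item", {"order_line": "c1-a"}),
    ("c1", "Pack item", {"order_line": "c1-a"}),
    ("c1", "Pick item", {"order_line": "c1-b"}),
    ("c1", "Pack item", {"order_line": "c1-b"}),
    ("c1", "Ship order", {}),
]

def split_log(events, key):
    parent, sub = [], defaultdict(list)
    for case_id, activity, attrs in events:
        if key in attrs:
            sub[attrs[key]].append(activity)    # one subprocess instance
        else:
            parent.append((case_id, activity))  # stays in the parent log
    return parent, dict(sub)

parent_log, sub_logs = split_log(log, "order_line")
print(parent_log)  # [('c1', 'Receive order'), ('c1', 'Ship order')]
print(sub_logs)    # {'c1-a': ['Pick item', 'Pack item'], 'c1-b': [...]}
```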
Abstract:
Precise clock synchronization is essential in emerging time-critical distributed control systems operating over computer networks, where the clock synchronization requirements are mostly focused on relative clock synchronization and high synchronization precision. Existing clock synchronization techniques such as the Network Time Protocol (NTP) and the IEEE 1588 standard can be difficult to apply to such systems because of the highly precise hardware clocks they require, the network congestion caused by a high frequency of synchronization message transmissions, and high overheads. In response, we present a Time Stamp Counter based precise Relative Clock Synchronization Protocol (TSC-RCSP) for distributed control applications operating over local-area networks (LANs). In our protocol, a software clock based on the TSC register, counting CPU cycles, is adopted in the time clients and server. TSC-based clocks offer clients a precise, stable and low-cost clock synchronization solution. Experimental results show that clock precision on the order of 10 microseconds can be achieved in small-scale LAN systems. Such clock precision is much higher than that of a processor's Time-Of-Day clock, and is easily sufficient for most distributed real-time control applications over LANs.
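The abstract does not detail the TSC-RCSP message exchange itself; the sketch below shows the standard two-way offset estimation that relative synchronization protocols of this kind build on, with time.perf_counter_ns() standing in for a TSC-derived cycle-counting software clock:

```python
# Two-way clock offset estimation (NTP-style arithmetic, shown here as a
# generic illustration rather than the TSC-RCSP wire protocol).
import time

def tsc_clock_ns():
    return time.perf_counter_ns()   # stand-in for a TSC-based software clock

def estimate_offset(read_server_clock_ns):
    t1 = tsc_clock_ns()             # client: request sent
    t2 = read_server_clock_ns()     # server: request received
    t3 = read_server_clock_ns()     # server: reply sent
    t4 = tsc_clock_ns()             # client: reply received
    offset = ((t2 - t1) + (t3 - t4)) // 2   # server clock minus client clock
    delay = (t4 - t1) - (t3 - t2)           # network round-trip time
    return offset, delay

# Simulate a server clock running 5 ms ahead of the client's clock:
offset, delay = estimate_offset(lambda: tsc_clock_ns() + 5_000_000)
print(offset, delay)                # offset comes out close to 5_000_000 ns
```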
Abstract:
Numeric sets can be used to store and distribute important information such as currency exchange rates and stock forecasts. It is useful to watermark such data to prove ownership in case of illegal distribution. This paper analyzes the numerical set watermarking model presented by Sion et al. in “On watermarking numeric sets”, identifies its weaknesses, and proposes a novel scheme that overcomes these problems. One of the weaknesses of Sion's watermarking scheme is the requirement of a normally distributed set, which does not hold for many numeric sets such as forecast figures. Experiments indicate that the scheme is also susceptible to subset addition and secondary watermarking attacks. The watermarking model we propose can be used for numeric sets with arbitrary distribution. Theoretical analysis and experimental results show that the scheme is strongly resilient against sorting, subset selection, subset addition, distortion, and secondary watermarking attacks.
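To illustrate why no distribution assumption is needed (this sketch is mine, not the scheme the paper proposes), a secret key can select values by their stable high-order digits and embed one bit in the parity of the last digit; because each value carries its own mark, detection survives sorting and subset selection:

```python
# Keyed watermarking of a set of non-negative integers, independent of the
# set's distribution. The key name and parameters are assumptions.
import hmac, hashlib

KEY = b"owner-secret"

def _select(stable):
    """Keyed decision based on the stable high-order part of a value."""
    d = hmac.new(KEY, str(stable).encode(), hashlib.sha256).digest()
    return d[0] % 2 == 0, d[1] & 1   # (mark this value?, target parity)

def embed(values):
    out = []
    for v in values:
        chosen, bit = _select(v // 10)
        if chosen and v % 2 != bit:
            v = v + 1 if v % 10 == 0 else v - 1  # keeps v // 10 unchanged
        out.append(v)
    return out

def detect(values):
    hits = total = 0
    for v in values:
        chosen, bit = _select(v // 10)
        if chosen:
            total += 1
            hits += (v % 2 == bit)
    return hits / total if total else 0.0

data = [1345, 2210, 987, 450, 7721, 309, 118, 6042]
print(detect(embed(data)))  # 1.0: key-selected values still carry their marks
```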
Abstract:
The increased interest in the area of process improvement prompted Rabobank Group ICT to examine its own Change process in order to improve its competitiveness. The group is looking for answers about the effectiveness of changes applied as part of this process, with particular interest in the presence of predictive patterns and their parameters. We conducted an analysis of the log using well-established process mining techniques (i.e., the Fuzzy Miner). The results of the analysis conducted on the process log show that a visible impact is absent.
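For context on the technique family (this sketch is illustrative and not the Rabobank analysis), the Fuzzy Miner and related frequency-based process mining approaches start from directly-follows counts over the traces in the log:

```python
# Count directly-follows relations over hypothetical change-process traces;
# such counts feed the significance metrics of miners like the Fuzzy Miner.
from collections import Counter

traces = [  # each list is one case's ordered activity sequence
    ["Register", "Assess", "Implement", "Close"],
    ["Register", "Assess", "Reject"],
    ["Register", "Assess", "Implement", "Close"],
]

dfg = Counter()
for trace in traces:
    for a, b in zip(trace, trace[1:]):
        dfg[(a, b)] += 1            # activity a directly followed by b

for (a, b), n in dfg.most_common():
    print(f"{a} -> {b}: {n}")       # e.g. Register -> Assess: 3
```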
Abstract:
Helplines are services where callers can request help, advice, information, or support. While such help is usually offered through telephone helplines, web chat and email helplines are becoming increasingly available to members of the public. Helplines tend to offer specialized services, such as responding to computer software queries, addressing medical and health issues, or providing information about natural disasters. Further, they may be aimed at particular populations such as children and young people. The earliest interactional research investigating discourse in calls to helplines began in the 1960s with Sacks’ early work on calls to a suicide prevention center. Since then, interactional research has produced a wealth of understanding of the mundane and institutional interactional practices through which help is sought and delivered. In addition to discussing the breadth of research into helplines, this entry explores the relationship between the philosophies and interactional practices of helpline services.
Abstract:
Network Real-Time Kinematic (NRTK) is a technology that can provide centimeter-level accuracy positioning services in real time, enabled by a network of Continuously Operating Reference Stations (CORS). The location-oriented CORS placement problem is an important problem in the design of an NRTK, as it directly affects not only the installation and operational cost of the NRTK, but also the quality of the positioning services it provides. This paper presents a Memetic Algorithm (MA) for the location-oriented CORS placement problem, which hybridizes the powerful explorative search capacity of a genetic algorithm with the efficient and effective exploitative search capacity of local optimization. Experimental results show that the MA performs better than existing approaches. We also conduct an empirical study of the scalability of the MA, the effectiveness of the hybridization technique, and the selection of the crossover operator in the MA.
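The following toy skeleton (an illustration with an invented one-dimensional objective, not the paper's CORS placement model or operators) shows the memetic pattern described above: a genetic loop explores via recombination and mutation, and each offspring is refined by hill-climbing local search:

```python
# Memetic algorithm skeleton: genetic exploration + local-search exploitation.
import random

random.seed(1)
SITES = range(100)                               # candidate station positions
DEMAND = [random.uniform(0, 99) for _ in range(30)]
K = 5                                            # stations to place

def cost(placement):
    """Worst distance from any demand point to its nearest station."""
    return max(min(abs(p - d) for p in placement) for d in DEMAND)

def local_search(placement):
    """Exploitation: nudge stations while the cost keeps improving."""
    placement, improved = list(placement), True
    while improved:
        improved = False
        for i in range(K):
            for step in (-1, 1):
                cand = placement[:]
                cand[i] = min(max(cand[i] + step, 0), 99)
                if len(set(cand)) == K and cost(cand) < cost(placement):
                    placement, improved = cand, True
    return placement

def evolve(pop_size=20, generations=40):
    pop = [random.sample(SITES, K) for _ in range(pop_size)]
    for _ in range(generations):
        a, b = random.sample(pop, 2)             # exploration: recombine
        child = random.sample(list(set(a) | set(b)), K)
        child[random.randrange(K)] = random.choice(SITES)   # mutation
        if len(set(child)) == K:
            child = local_search(child)          # memetic refinement step
            worst = max(pop, key=cost)
            if cost(child) < cost(worst):
                pop[pop.index(worst)] = child    # replace worst individual
    return min(pop, key=cost)

best = evolve()
print(sorted(best), round(cost(best), 2))
```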
Abstract:
Distributed computation and storage have been widely used for processing big data sets. For many big data problems, with the size of data growing rapidly, the distribution of computing tasks and related data can greatly affect the performance of the computing system. In this paper, a distributed computing framework is presented for high-performance computing of All-to-All Comparison Problems. A data distribution strategy is embedded in the framework for reduced storage space and balanced computing load. Experiments are conducted to demonstrate the effectiveness of the developed approach. They show that about 88% of the ideal performance capacity is achieved across multiple machines using the approach presented in this paper.
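As a sketch of the underlying scheduling problem only (the paper's actual distribution strategy is not reproduced here), every pair of items must be compared exactly once; even a naive round-robin assignment balances the comparison load, and the storage each machine needs follows from which items its pairs touch:

```python
# Assign all C(n, 2) pairwise comparisons across machines and report each
# machine's comparison load and required local storage. Item names and the
# round-robin strategy are illustrative assumptions.
from itertools import combinations

items = [f"seq{i}" for i in range(8)]      # hypothetical data items
machines = 3

assignment = {m: [] for m in range(machines)}
for k, pair in enumerate(combinations(items, 2)):
    assignment[k % machines].append(pair)  # round-robin balances load

for m, pairs in assignment.items():
    stored = {x for pair in pairs for x in pair}
    print(f"machine {m}: {len(pairs)} comparisons, {len(stored)} items stored")
# A smarter distribution also minimizes the items each machine must store,
# which is the storage-reduction goal the abstract describes.
```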