994 results for Continuous programming
Abstract:
The act of computer programming is generally considered to be temporally removed from a computer program’s execution. In this paper we discuss the idea of programming as an activity that takes place within the temporal bounds of a real-time computational process and its interactions with the physical world. We ground these ideas within the context of livecoding – a live audiovisual performance practice. We then describe how the development of the programming environment “Impromptu” has addressed our ideas of programming with time and the notion of the programmer as an agent in a cyber-physical system.
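The core idiom behind programming inside a running process is often called temporal recursion: a function does some work at a precise logical time and then schedules its own next invocation. Impromptu itself is Scheme-based, so the Python sketch below only illustrates that idiom; it is not Impromptu's API, and the scheduler, beat counter, and interval are illustrative assumptions.

```python
# Minimal sketch of "temporal recursion" (assumed illustration, not Impromptu's API):
# a function does some work at a logical time, then schedules its own next call
# against the logical timeline so timing errors do not accumulate.
import sched
import time

scheduler = sched.scheduler(time.time, time.sleep)
start = time.time()

def pulse(beat, interval=0.5, total=8):
    """Run at logical time start + beat*interval, then reschedule ourselves."""
    print(f"beat {beat} (drift {time.time() - (start + beat * interval):+.4f}s)")
    if beat + 1 < total:
        scheduler.enterabs(start + (beat + 1) * interval, 1, pulse,
                           argument=(beat + 1, interval, total))

scheduler.enterabs(start, 1, pulse, argument=(0,))
scheduler.run()
```

Because each call edits only its own future, the body of pulse can be redefined while the process runs, which is the sense in which the programmer acts within, rather than before, execution.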
Abstract:
It is acknowledged around the world that many university students struggle with learning to program (McCracken et al., 2001; McGettrick et al., 2005). In this paper, we describe how we have developed a research programme to systematically study and incrementally improve our teaching. We have adopted a research programme with three elements: (1) a theory that provides an organising framework for defining the type of phenomena and data of interest, (2) data on how the class as a whole performs on formative assessment tasks that are framed from within the organising framework, and (3) data from one-on-one think aloud sessions, to establish why students struggle with some of those in-class formative assessment tasks. We teach introductory computer programming, but this three-element structure of our research is applicable to many areas of engineering education research.
Abstract:
Wires of YBa2Cu3O7-x were fabricated by extrusion using a hydroxypropyl methylcellulose (HPMC) binder. As little as 2 wt.% binder was added to an oxide prepared by a novel co-precipitation process, to produce a plastic mass which readily gave continuous extrusion of long lengths of wire in a reproducible fashion. Critical temperatures of 92 K were obtained for wires given optimum high-temperature heat treatments. Critical current densities greater than 1000 A cm-2 were measured at 77.3 K using heat treatments at around 910°C for 10 h. These transport critical current densities, measured on centimeter-long wires, were obtained with microstructures showing a relatively dense and uniform distribution of randomly oriented, small YBa2Cu3O7-x grains.
Abstract:
Student performance on examinations is influenced by the level of difficulty of the questions. It seems reasonable to propose therefore that assessment of the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries, and find substantial variation across the exams for all measures. Most exams include a mix of questions of low, medium, and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with assessment of difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
Abstract:
Recent research has proposed Neo-Piagetian theory as a useful way of describing the cognitive development of novice programmers. Neo-Piagetian theory may also be a useful way to classify materials used in learning and assessment. If Neo-Piagetian coding of learning resources is to be useful then it is important that practitioners can learn it and apply it reliably. We describe the design of an interactive web-based tutorial for Neo-Piagetian categorization of assessment tasks. We also report an evaluation of the tutorial's effectiveness, in which twenty computer science educators participated. The average classification accuracy of the participants on the three Neo-Piagetian stages was 85%, 71% and 78% respectively. Participants also rated their agreement with the expert classifications, and indicated high agreement (91%, 83% and 91% across the three Neo-Piagetian stages). Self-rated confidence in applying Neo-Piagetian theory to classifying programming questions was 29% before the tutorial and 75% after it. Our key contribution is to demonstrate the feasibility of the Neo-Piagetian approach to classifying assessment materials, by showing that it is learnable and can be applied reliably by a group of educators. Our tutorial is freely available as a community resource.
Abstract:
Internet services are an important part of daily activities for most of us. These services come with sophisticated authentication requirements which may not be handled well by average Internet users. The management of secure passwords, for example, creates an extra overhead which is often neglected for usability reasons. Furthermore, password-based approaches apply only to the initial login and do not protect against unlocked-workstation attacks. In this paper, we provide a non-intrusive identity verification scheme based on behavioral biometrics, where keystroke dynamics based on free text is used continuously to verify the identity of a user in real time. We improve existing keystroke-dynamics-based verification schemes in four respects. First, we improve scalability by comparing against a constant number of users instead of the whole user space when verifying the identity of the target user. Second, we provide an adaptive user model which enables our solution to take changes in user behavior into account in the verification decision. Third, we identify a new distance measure which enables us to verify the identity of a user with shorter text. Fourth, we decrease the number of false results. Our solution is evaluated on a data set which we collected from users while they were interacting with their mailboxes during their daily activities.
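The abstract does not spell out the new distance measure, so the sketch below shows only a common baseline for free-text keystroke dynamics: comparing the relative ordering of shared digraph latencies between an enrolled profile and a fresh sample, in the spirit of Gunetti and Picardi's "R" measure. The digraphs, latencies and threshold interpretation are illustrative assumptions.

```python
# Illustrative baseline, not the paper's measure: compare the relative order of
# shared digraph latencies between two typing samples; smaller distance suggests
# the same typist.
from typing import Dict

def relative_distance(sample_a: Dict[str, float], sample_b: Dict[str, float]) -> float:
    """Disorder between the latency rankings of digraphs shared by two samples.

    Each sample maps a digraph (e.g. "th") to its mean key-to-key latency in ms.
    Returns a value in [0, 1]; 0 means identical ordering, 1 means fully reversed.
    """
    shared = sorted(set(sample_a) & set(sample_b))
    if len(shared) < 2:
        return 1.0  # not enough evidence to compare
    order_a = sorted(shared, key=lambda d: sample_a[d])
    order_b = sorted(shared, key=lambda d: sample_b[d])
    pos_b = {d: i for i, d in enumerate(order_b)}
    disorder = sum(abs(i - pos_b[d]) for i, d in enumerate(order_a))
    max_disorder = len(shared) ** 2 // 2  # maximum possible total displacement
    return disorder / max_disorder

# Example: enrolment profile vs. a fresh session (hypothetical values).
enrolled = {"th": 110.0, "he": 95.0, "in": 130.0, "er": 120.0}
session  = {"th": 115.0, "he": 100.0, "in": 128.0, "er": 160.0}
print(relative_distance(enrolled, session))
```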
Abstract:
There are different ways to authenticate humans, which is an essential prerequisite for access control. The authentication process can be subdivided into three categories that rely on something someone i) knows (e.g. a password), and/or ii) has (e.g. a smart card), and/or iii) is (biometric features). Besides classical attacks on password solutions and the risk that identity-related objects can be stolen, traditional biometric solutions have their own disadvantages, such as the requirement for expensive devices, the risk of stolen bio-templates, etc. Moreover, existing approaches usually perform authentication only once, at the initial login. Non-intrusive and continuous monitoring of user activities emerges as a promising way to harden the authentication process by extending the third category: iii-2) how someone behaves. In recent years various keystroke-dynamics behavior-based approaches have been published that are able to authenticate humans based on their typing behavior. The majority focus on so-called static text approaches, where users are requested to type a previously defined text. Relatively few techniques are based on free text approaches that allow transparent monitoring of user activities and provide continuous verification. Unfortunately only a few solutions are deployable in application environments under realistic conditions; unsolved problems include scalability, high response times and high error rates. The aim of this work is the development of behavior-based verification solutions. Our main requirement is to deploy these solutions under realistic conditions within existing environments, in order to enable transparent, free-text-based continuous verification of active users with low error rates and response times.
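As a rough illustration of free-text continuous verification (not this work's actual design), the sketch below keeps a sliding window of recent keystroke events and re-issues an accept/reject decision every few events rather than only at login; the window size, step, threshold, and score callback are all assumptions.

```python
# Minimal sketch (assumptions only): continuous verification over a keystroke
# stream, re-deciding every `step` events once a full window of events exists.
from collections import deque
from typing import Callable, Deque, Iterable, Tuple

KeyEvent = Tuple[str, float]  # (key, timestamp in seconds)

def continuous_verify(events: Iterable[KeyEvent],
                      score: Callable[[Deque[KeyEvent]], float],
                      window: int = 200, step: int = 50,
                      threshold: float = 0.35):
    """Yield (timestamp, accepted) each time `step` new events complete a window."""
    buf: Deque[KeyEvent] = deque(maxlen=window)
    since_last = 0
    for key, ts in events:              # events is any iterable/stream of key events
        buf.append((key, ts))
        since_last += 1
        if len(buf) == window and since_last >= step:
            since_last = 0
            # Lower score means the window is closer to the enrolled profile.
            yield ts, score(buf) <= threshold
```

A caller would supply a score function built from the user's enrolled profile (for example, a digraph-latency distance) and lock the session after a run of rejected windows.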
Abstract:
This paper considers the problem of reconstructing the motion of a 3D articulated tree from 2D point correspondences subject to some temporal prior. Hitherto, smooth motion has been encouraged using a trajectory basis, yielding a hard combinatorial problem with time complexity growing exponentially in the number of frames. Branch and bound strategies have previously attempted to curb this complexity whilst maintaining global optimality. However, they provide no guarantee of being more efficient than exhaustive search. Inspired by recent work which reconstructs general trajectories using compact high-pass filters, we develop a dynamic programming approach which scales linearly in the number of frames, leveraging the intrinsically local nature of filter interactions. Extension to affine projection enables reconstruction without estimating cameras.
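The paper's exact formulation is not reproduced here; as an illustration of why local, filter-like temporal interactions make the problem linear in the number of frames, the following Viterbi-style dynamic program over per-frame candidates is a generic sketch, with toy unary costs and a smoothness prior as assumptions.

```python
# Illustrative sketch, not the paper's solver: a chain dynamic program whose cost
# couples only neighbouring frames, so total work grows linearly with the number
# of frames rather than exponentially as in exhaustive search.
import numpy as np

def chain_dp(unary: np.ndarray, pairwise) -> list:
    """unary: (T, K) per-frame candidate costs; pairwise(t, i, j): transition cost.

    Returns the index of the chosen candidate at each of the T frames.
    """
    T, K = unary.shape
    cost = unary[0].copy()                 # best cost ending in each candidate
    back = np.zeros((T, K), dtype=int)     # backpointers for path recovery
    for t in range(1, T):
        trans = np.array([[pairwise(t, i, j) for j in range(K)] for i in range(K)])
        total = cost[:, None] + trans + unary[t][None, :]
        back[t] = total.argmin(axis=0)
        cost = total.min(axis=0)
    path = [int(cost.argmin())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy usage: 5 frames, 3 depth candidates per frame, smoothness-preferring prior.
rng = np.random.default_rng(0)
unary = rng.random((5, 3))
smooth = lambda t, i, j: 0.5 * abs(i - j)  # penalise jumps between frames
print(chain_dp(unary, smooth))
```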
Abstract:
Since 1 December 2002, the New Zealand Exchange’s (NZX) continuous disclosure listing rules have operated with statutory backing. To test the effectiveness of the new corporate disclosure regime, we compare the change in the quantity of market announcements (overall, non-routine, non-procedural and external) released to the NZX before and after the introduction of statutory backing. We also extend our study by investigating whether the effectiveness of the new corporate disclosure regime is diminished or augmented by corporate governance mechanisms, including board size, separate roles for CEO and Chairman, board independence, board gender diversity and audit committee independence. Our findings provide qualified support for the effectiveness of the new corporate disclosure regime regarding the quantity of market disclosures. There is strong evidence that the effectiveness of the new corporate disclosure regime was augmented by separate roles for CEO and Chairman, board gender diversity and audit committee independence, and diminished by board size. In addition, there is significant evidence that share price queries do impact corporate disclosure behaviour, and that this impact is significantly influenced by corporate governance mechanisms. Our findings provide important implications for corporate regulators in their quest for a superior disclosure regime.
Abstract:
User interfaces for source code editing are a crucial component in any software development environment, and in many editors visual annotations (overlaid on the textual source code) are used to provide important contextual information to the programmer. This paper focuses on the real-time programming activity of ‘cyberphysical’ programming, and considers the type of visual annotations which may be helpful in this programming context.
Abstract:
The Lake Wivenhoe Integrated Wireless Sensor Network is conceptually similar to traditional SCADA monitoring and control approaches. However, it is applied in an open system, using wireless devices to monitor processes that affect water quality at both high spatial and high temporal frequency. This monitoring assists scientists to better understand the drivers of key processes that influence water quality, and provides operators with an early warning system if below-standard water enters the reservoir. Both of these aspects improve the safe and efficient delivery of drinking water to end users.
Abstract:
Bi-2212 thick films on silver tape are seen as a simple and low-cost alternative to high temperature superconducting wires produced by the Powder In Tube (PIT) technique, particularly in react and wind applications. A rig for the continuous production of Bi-2212 tapes for use in react and wind component manufacture has been developed and commissioned. The rig consists of several sections, each fully automatic, for task-specific duties in the production of HTS tape. The major sections are: tape coating, sintering and annealing. High temperature superconducting tapes with engineering critical current densities of 10 kA/cm2 (77 K, self field), and lengths of up to 100 m, have been produced using the rig. Properties of the finished tape are discussed and results are presented for current density versus bend radius and applied strain. Depending on tape content and thickness, the irreversible strain ε_irr varies between 0.04 and 0.1%. Cyclic bending tests in which the applied strain does not exceed ε_irr showed negligible reduction in Jc along the length of the tape.
Abstract:
This special issue of the Journal of Urban Technology brings together five articles that are based on presentations given at the Street Computing Workshop held on 24 November 2009 in Melbourne in conjunction with the Australian Computer-Human Interaction conference (OZCHI 2009). Our own article introduces the Street Computing vision and explores the potential, challenges, and foundations of this research trajectory. In order to do so, we first look at the currently available sources of information and discuss their link to existing research efforts. Section 2 then introduces the notion of Street Computing and our research approach in more detail. Section 3 looks beyond the core concept itself and summarizes related work in this field of interest. We conclude by introducing the papers that have been contributed to this special issue.
Abstract:
Diesel particulate matter (DPM), in particular, has been likened, in a somewhat inflammatory manner, to the "next asbestos". From the business change perspective, three areas are holding the industry back from fully engaging with the issue:
1. There is no real feedback loop in any operational sense to assess the impact of investment in, or application of, controls to manage diesel emissions.
2. DPM are getting ever smaller and more numerous, but there is no practical way of measuring them to regulate them in the field. Mass, the current basis of regulation, is becoming less and less relevant.
3. Diesel emissions management is generally viewed wholly as a cost, yet good management offers significant areas of benefit.
This paper discusses a feedback approach to address these three areas and move the industry forward. Six main areas of benefit from providing a feedback loop by continuously monitoring diesel emissions have been identified:
1. Condition-based maintenance: emissions change instantaneously if engine condition changes.
2. Operator performance: an operator can use a lot more fuel for little incremental work output through poor technique or discipline.
3. Vehicle utilisation: operating hours achieved, and the ratio of idling to time under power, affect the proportion of emissions produced with no economic value.
4. Fuel efficiency: this allows visibility into other contributing configuration and environmental factors for the vehicle.
5. Emission rates: this gives scope to directly address the required ratio of ventilation to diesel emissions.
6. Total carbon emissions: calculating emissions individually for each vehicle, rather than just reporting on fuel delivered to a site, supports NGER-type reporting requirements.