95 results for programming style
Abstract:
This special issue of the Journal of Urban Technology brings together five articles that are based on presentations given at the Street Computing workshop held on 24 November 2009 in Melbourne in conjunction with the Australian Computer-Human Interaction conference (OZCHI 2009). Our own article introduces the Street Computing vision and explores the potential, challenges and foundations of this research vision. In order to do so, we first look at the currently available sources of information and discuss their link to existing research efforts. Section 2 then introduces the notion of Street Computing and our research approach in more detail. Section 3 looks beyond the core concept itself and summarises related work in this field of interest.
Abstract:
Feedback on student performance, whether in the classroom or on written assignments, enables students to reflect on their understandings and restructure their thinking in order to develop more powerful ideas and capabilities. Research has identified a number of broad principles of good feedback practice. These include the provision of feedback that facilitates the development of reflection in learning; helps clarify what good performance is in terms of goals, criteria and expected standards; provides opportunities to close the gap between current and desired performance; delivers high-quality information to students about their learning; and encourages positive motivational beliefs and self-esteem. However, high staff–student ratios and time pressures often result in a gulf between this ideal and reality. Whilst greater use of criteria-referenced assessment has improved the extent of feedback provided to students, this measure alone does not go far enough to satisfy the requirements of good feedback practice. Technology offers an effective and efficient means by which personalised feedback may be provided to students. This paper presents the findings of a trial of the use of the freely available Audacity program to provide individual feedback via MP3 recordings to final-year Media Law students at the Queensland University of Technology on their written assignments. The trial won wide acclaim from students as an effective means of explaining the exact reasons why they received the marks they were awarded, the things they did well, and the areas needing improvement. It also showed that good feedback practice can be achieved without the burden of an increased staff workload.
Abstract:
Introduction: Feeding on demand supports an infant’s innate capacity to respond to hunger and satiety cues and may promote later self-regulation of intake. Our aim was to examine whether feeding style (on demand vs to schedule) is associated with weight gain in early life. Methods: Participants were first-time mothers of healthy term infants enrolled in NOURISH, an RCT evaluating an intervention to promote positive early feeding practices. Baseline assessment occurred when infants were aged 2-7 months. Infants who could be clearly categorised as fed on demand or to schedule (mothers’ self-report) were included in the logistic regression analysis. The model was adjusted for gender, breastfeeding, and maternal age, education, and BMI. Weight gain was defined as a positive difference between baseline and birthweight z-scores (WHO standards), which indicated tracking above the birth weight percentile. Results: Data from 356 infants with a mean age of 4.4 (SD 1.0) months were available. Of these, 197 (55%) were fed on demand and 42 (12%) were fed to schedule. There was no statistically significant association between feeding style and weight gain [OR=0.72 (95%CI 0.35-1.46), P=0.36]. Formula-fed infants were three times more likely to be fed to schedule, and formula feeding was independently associated with increased weight gain [OR=2.02 (95%CI 1.11-3.66), P=0.021]. Conclusion: In this preliminary analysis the association between feeding style and weight gain did not reach statistical significance; however, the effect size may be clinically relevant, and future analysis will include the full study sample (N=698).
Abstract:
Examining the late style of a writer is like skirting around quicksand. End-of-career reflection can subvert long-standing critical accounts; revisionist publishing histories or newly minted archival work can do likewise. And, as Nancy J. Troy suggests, an artist’s last thoughts are rarely planned as such (15). In the case of Christina Stead, any consideration of late style is made more difficult because, chronologically speaking, her ‘late’ works were written some 20 years before her death in 1983. Thus chronology can be deceptive, as Nicholas Delbanco points out in Lastingness: The Art of Old Age. Stead’s last novel, I’m Dying Laughing: The Humourist, was completed, at least in rough draft form, in 1966, when Stead was 64, but friends and readers suggested many changes. The book was published posthumously in 1986. Stead’s work is receiving increasing critical attention, so a discussion of her ‘late style’ is important, particularly given that her fiction seems to refuse so many attempts at category-making. This perspective reveals two interesting aspects of her late work: first, her consistent engagement with the problems of age for women, and in particular women writers; and second, the consequence of a life-long attention to the representation of dialogic sound in her novels, a preoccupation that results in what can be termed an aural signature. My discussion refers to Edward Said’s and Nicholas Delbanco’s ideas about late style by way of a focus on selected biographical issues and Stead’s engagement with radical politics, before moving to an examination of what can be called an aural signature in several novels. Her fiction demonstrates one of the agreed markers of late style: she was constantly looking forward and looking back through innovation in form and content.
Abstract:
Programming is a subject that many beginning students find difficult. This paper describes a knowledge base designed for analyzing programs written in the PHP web development language. The aim is to use this knowledge base in an Intelligent Tutoring System that will provide effective feedback to students. A key premise of this research is that a programming exercise can have many correct solutions. This paper presents an overview of how the proposed knowledge base can be used to accept different solutions to a given exercise.
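The paper's knowledge base is not reproduced here; as a minimal sketch of its premise — that syntactically different programs can all be correct answers to one exercise — the snippet below (in Python rather than PHP, purely for illustration) judges two hypothetical student solutions equivalent by behaviour on test cases rather than by textual match:

```python
def check_solution(func, cases):
    """Return True if func produces the expected output for every test case."""
    return all(func(*args) == expected for args, expected in cases)

# Two different, equally correct solutions to the same exercise: sum of 1..n.
def solution_loop(n):
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def solution_formula(n):
    return n * (n + 1) // 2

cases = [((5,), 15), ((10,), 55), ((0,), 0)]
print(check_solution(solution_loop, cases))     # True
print(check_solution(solution_formula, cases))  # True
```

Behavioural checking alone cannot explain *why* a wrong answer is wrong, which is where a knowledge base of solution strategies, as the paper proposes, goes further.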
Abstract:
Proving security of cryptographic schemes, which are normally short algorithms, is known to be time-consuming and easy to get wrong. Using computers to analyse their security can help to solve the problem. This thesis focuses on methods of using computers to verify the security of such schemes in cryptographic models. The contributions of this thesis to automated security proofs of cryptographic schemes can be divided into two groups: indirect and direct techniques. Regarding indirect ones, we propose a technique to verify the security of public-key-based key exchange protocols. Security of such protocols can already be proved automatically using an existing tool, but only in a non-cryptographic model. We show that under some conditions, security in that non-cryptographic model implies security in a common cryptographic one, the Bellare-Rogaway model [11]. The implication enables one to use that existing tool, which was designed to work with a different type of model, to achieve security proofs of public-key-based key exchange protocols in a cryptographic model. For direct techniques, we have two contributions. The first is a tool to verify Diffie-Hellman-based key exchange protocols. In that work, we design a simple programming language for specifying Diffie-Hellman-based key exchange algorithms. The language has a semantics based on a cryptographic model, the Bellare-Rogaway model [11]. From the semantics, we build a Hoare-style logic which allows us to reason about the security of a key exchange algorithm, specified as a pair of initiator and responder programs. The other contribution in the direct line is on automated proofs of computational indistinguishability. Unlike the other two contributions, this one does not treat a fixed class of protocols. We construct a generic formalism which allows one to model the security problem of a variety of classes of cryptographic schemes as the indistinguishability between two pieces of information.
We also design and implement an algorithm for solving indistinguishability problems. Compared to the two other works, this one covers significantly more types of schemes, but consequently, it can verify only weaker forms of security.
Abstract:
The act of computer programming is generally considered to be temporally removed from a computer program's execution. In this paper we discuss the idea of programming as an activity that takes place within the temporal bounds of a real-time computational process and its interactions with the physical world. We ground these ideas within the context of livecoding -- a live audiovisual performance practice. We then describe how the development of the programming environment "Impromptu" has addressed our ideas of programming with time and the notion of the programmer as an agent in a cyber-physical system.
Abstract:
It is acknowledged around the world that many university students struggle with learning to program (McCracken et al., 2001; McGettrick et al., 2005). In this paper, we describe how we have developed a research programme to systematically study and incrementally improve our teaching. We have adopted a research programme with three elements: (1) a theory that provides an organising framework for defining the type of phenomena and data of interest, (2) data on how the class as a whole performs on formative assessment tasks that are framed from within the organising framework, and (3) data from one-on-one think aloud sessions, to establish why students struggle with some of those in-class formative assessment tasks. We teach introductory computer programming, but this three-element structure of our research is applicable to many areas of engineering education research.
Abstract:
Student performance on examinations is influenced by the level of difficulty of the questions. It seems reasonable to propose therefore that assessment of the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries, and find substantial variation across the exams for all measures. Most exams include a mix of questions of low, medium, and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with assessment of difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
Abstract:
Recent research has proposed Neo-Piagetian theory as a useful way of describing the cognitive development of novice programmers. Neo-Piagetian theory may also be a useful way to classify materials used in learning and assessment. If Neo-Piagetian coding of learning resources is to be useful, then it is important that practitioners can learn it and apply it reliably. We describe the design of an interactive web-based tutorial for Neo-Piagetian categorization of assessment tasks. We also report an evaluation of the tutorial's effectiveness, in which twenty computer science educators participated. The participants' average classification accuracies on the three Neo-Piagetian stages were 85%, 71% and 78%. Participants also rated their agreement with the expert classifications, and indicated high agreement (91%, 83% and 91% across the three Neo-Piagetian stages). Self-rated confidence in applying Neo-Piagetian theory to classifying programming questions was 29% before the tutorial and 75% after it. Our key contribution is the demonstration of the feasibility of the Neo-Piagetian approach to classifying assessment materials, by demonstrating that it is learnable and can be applied reliably by a group of educators. Our tutorial is freely available as a community resource.
Abstract:
Understanding the link between tectonic-driven extensional faulting and volcanism is crucial from a hazard perspective in active volcanic environments, while ancient volcanic successions provide records on how volcanic eruption styles, compositions, magnitudes and frequencies can change in response to extension timing, distribution and intensity. This study draws on intimate relationships of volcanism and extension preserved in the Sierra Madre Occidental (SMO) and Gulf of California (GoC) regions of western Mexico. Here, a major Oligocene rhyolitic ignimbrite “flare-up” (>300,000 km3) switched to a dominantly bimodal and mixed effusive-explosive volcanic phase in the Early Miocene (~100,000 km3), associated with distributed extension and opening of numerous grabens. Rhyolitic dome fields were emplaced along graben edges and at intersections of cross-graben and graben-parallel structures during early stages of graben development. Concomitant with this change in rhyolite eruption style was a change in crustal source as revealed by zircon chronochemistry with rapid rates of rhyolite magma generation due to remelting of mid- to upper crustal, highly differentiated igneous rocks emplaced during earlier SMO magmatism. Extension became more focused ~18 Ma resulting in volcanic activity being localised along the site of GoC opening. This localised volcanism (known as the Comondú “arc”) was dominantly effusive and andesite-dacite in composition. This compositional change resulted from increased mixing of basaltic and rhyolitic magmas rather than fluid flux melting of the mantle wedge above the subducting Guadalupe Plate. A poor understanding of space-time relationships of volcanism and extension has thus led to incorrect past tectonic interpretations of Comondú-age volcanism.
Abstract:
This paper considers the problem of reconstructing the motion of a 3D articulated tree from 2D point correspondences subject to some temporal prior. Hitherto, smooth motion has been encouraged using a trajectory basis, yielding a hard combinatorial problem with time complexity growing exponentially in the number of frames. Branch and bound strategies have previously attempted to curb this complexity whilst maintaining global optimality. However, they provide no guarantee of being more efficient than exhaustive search. Inspired by recent work which reconstructs general trajectories using compact high-pass filters, we develop a dynamic programming approach which scales linearly in the number of frames, leveraging the intrinsically local nature of filter interactions. Extension to affine projection enables reconstruction without estimating cameras.
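The paper's actual formulation is not reproduced here; as a hedged sketch of why purely local (frame-to-frame) interactions admit dynamic programming that scales linearly in the number of frames, consider a Viterbi-style recursion in which a simple quadratic smoothness penalty stands in for the compact high-pass filters (the discrete per-frame states, e.g. depth hypotheses, are an assumption for illustration):

```python
# Rough sketch, not the paper's method: DP over frames where only consecutive
# frames interact, so total cost is O(T * S^2) -- linear in the frame count T.
def viterbi_smooth(unary, smooth_weight=1.0):
    """unary[t][s]: cost of discrete state s (e.g. a depth hypothesis)
    at frame t. Returns the minimum-cost state sequence."""
    T, S = len(unary), len(unary[0])
    cost = list(unary[0])          # best cost ending in each state at frame 0
    back = []                      # back-pointers for path recovery
    for t in range(1, T):
        prev, cost, ptrs = cost, [], []
        for s in range(S):
            # best predecessor under the quadratic smoothness penalty
            p = min(range(S),
                    key=lambda p: prev[p] + smooth_weight * (s - p) ** 2)
            cost.append(prev[p] + smooth_weight * (s - p) ** 2 + unary[t][s])
            ptrs.append(p)
        back.append(ptrs)
    s = min(range(S), key=lambda i: cost[i])
    path = [s]
    for ptrs in reversed(back):    # walk back-pointers from the last frame
        s = ptrs[s]
        path.append(s)
    return list(reversed(path))
```

Because each frame's best cost depends only on the previous frame, the table is filled in one linear sweep — the same locality the abstract attributes to compact filter interactions, in contrast to branch-and-bound over all frames at once.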