982 results for: second programming course


Relevance: 40.00%

Abstract:

With: A Practical application of The Principles of geometry to the mensuration of superficies and solids: being the third part of a Course of mathematics, ...

Relevance: 40.00%

Abstract:

The articles on electricity and magnetism were selected from Biot's Précis élémentaire de physique; that on electro-dynamics is from Despretz's Traité élémentaire de physique, 4th ed., 1836.

Relevance: 40.00%

Abstract:

Abstract: Quantitative Methods (QM) is a compulsory course in the Social Science program in CEGEP. Many QM instructors assign a number of homework exercises to give students the opportunity to practice the statistical methods, which enhances their learning. However, traditional written exercises have two significant disadvantages: the feedback process is often very slow, and written exercises can generate a large amount of correction work for the instructor. WeBWorK is an open-source system that allows instructors to write exercises which students answer online. Although originally designed for math and science exercises, WeBWorK programming allows for the creation of a variety of questions that can be used in the Quantitative Methods course. Because many statistical exercises generate objective and quantitative answers, the system can instantly assess students' responses and tell them whether they are right or wrong. This immediate feedback has been shown to be theoretically conducive to positive learning outcomes. In addition, the system can be set up to allow students to retry a problem they got wrong, which has benefits both for student motivation and for reinforcing learning. Through a quasi-experiment, this research project measured and analysed the effects of using WeBWorK exercises in the Quantitative Methods course at Vanier College. Three specific research questions were addressed. First, we looked at whether students who did the WeBWorK exercises got better grades than students who did written exercises. Second, we looked at whether students who completed more of the WeBWorK exercises got better grades than students who completed fewer of them. Finally, we used a self-report survey to find out students' perceptions and opinions of the WeBWorK and written exercises.
For the first research question, a crossover design was used to compare whether the group that did WeBWorK problems during one unit would score significantly higher on that unit test than the group that did the written problems. We found no significant difference in grades between students who did the WeBWorK exercises and students who did the written exercises. The second research question looked at whether students who completed more of the WeBWorK exercises would get significantly higher grades than students who completed fewer of them. The linear relationship between the number of WeBWorK exercises completed and grades was positive in both groups; however, the correlation coefficients for these two variables showed no consistent pattern. Our third research question was investigated using a survey to elicit students' perceptions and opinions regarding the WeBWorK and written exercises. Students reported no difference in the amount of effort put into completing each type of exercise. Students were also asked to rate each type of exercise along six dimensions, and a composite score was calculated. Overall, students gave a significantly higher score to the written exercises, and reported that the written exercises were better for understanding the basic statistical concepts and for learning the basic statistical methods. However, when presented with the choice of having only written or only WeBWorK exercises, slightly more students preferred or strongly preferred having only WeBWorK exercises. The results of this research suggest that the advantages of using WeBWorK to teach Quantitative Methods are variable. The WeBWorK system offers immediate feedback, which often seems to motivate students to try again if they do not have the correct answer. However, this does not necessarily translate into better performance on the written tests and on the final exam.
What has been learned is that the WeBWorK system can be used by interested instructors to enhance student learning in the Quantitative Methods course. Further research may examine more specifically how this system can be used more effectively.
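The instant-feedback mechanism described above can be illustrated with a short sketch. This is hypothetical Python, not actual WeBWorK PG code: a parameterized statistics exercise is generated per student, a numeric answer is checked against a tolerance, and a wrong answer invites a retry.

```python
import random

def make_mean_exercise(seed):
    """Generate a randomized 'compute the mean' exercise, WeBWorK-style:
    each student (seed) gets a different data set."""
    rng = random.Random(seed)
    data = [rng.randint(1, 20) for _ in range(5)]
    answer = sum(data) / len(data)
    return data, answer

def grade(submitted, answer, tol=0.01):
    """Instant feedback: correct if within tolerance, else invite a retry."""
    if abs(submitted - answer) <= tol:
        return "correct"
    return "incorrect - try again"

data, answer = make_mean_exercise(seed=42)
print(grade(answer, answer))        # -> correct
print(grade(answer + 1.0, answer))  # -> incorrect - try again
```

Randomizing the data per seed also discourages answer sharing, one of the practical motivations for systems like WeBWorK.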

Relevance: 40.00%

Abstract:

Paper presented at PAEE/ALE'2016, 8th International Symposium on Project Approaches in Engineering Education (PAEE) and 14th Active Learning in Engineering Education Workshop (ALE)

Relevance: 30.00%

Abstract:

The economic use of a 500 ha irrigated area in Piracicaba was studied for maize, tomato, sugarcane and bean crops, using a deterministic linear programming model and a risk-including linear programming model (the Target-MOTAD model); two situations were analyzed. In the deterministic model, land was the restrictive factor and water was not restrictive in any of the tested situations. In the first situation the maximum income obtained was R$ 1,883,372.87, and in the second it was R$ 1,821,772.40. In the risk-including model, a risk-accepting producer can obtain, in the first situation, a maximum income of R$ 1,883,372.87 with a minimum risk of R$ 350 year(-1), and in the second situation R$ 1,821,772.40 with a minimum risk of R$ 40 year(-1). A risk-averse producer can obtain, in the first situation, a maximum income of R$ 1,775,974.81 with null risk, and in the second situation R$ 1,707,706.26 with null risk, both without water restriction. These results highlight the importance of including risk when offering alternative cropping plans to the producer, allowing decisions that take into account the producer's risk aversion and income expectations.
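The deterministic allocation can be pictured as a small linear program. The sketch below uses illustrative, made-up coefficients (the per-hectare incomes and water requirements are assumptions, not the study's data) and solves a two-crop version by enumerating the vertices of the feasible region, where an LP optimum must lie:

```python
from itertools import combinations

# Hypothetical data: maximize 2000*x1 + 5000*x2
#   (x1 = maize ha, x2 = tomato ha)
# subject to   x1 +   x2 <= 500    (land, ha)
#            3*x1 + 8*x2 <= 3000   (water, 10^3 m3)
#            x1, x2 >= 0
income = (2000.0, 5000.0)
constraints = [          # each row: a1*x1 + a2*x2 <= b
    (1.0, 1.0, 500.0),   # land
    (3.0, 8.0, 3000.0),  # water
    (-1.0, 0.0, 0.0),    # x1 >= 0
    (0.0, -1.0, 0.0),    # x2 >= 0
]

def intersect(c1, c2):
    """Intersection of two constraint boundary lines, or None if parallel."""
    (a1, b1, r1), (a2, b2, r2) = c1, c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

def feasible(pt):
    return all(a * pt[0] + b * pt[1] <= r + 1e-9 for a, b, r in constraints)

vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) and feasible(p)]
best = max(vertices, key=lambda p: income[0] * p[0] + income[1] * p[1])
print(best)                                       # (200.0, 300.0)
print(income[0] * best[0] + income[1] * best[1])  # 1900000.0
```

With these made-up numbers the land and water constraints are both binding at the optimum; in the study itself only land was binding, which this toy model does not reproduce.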

Relevance: 30.00%

Abstract:

Background: The aim of this study was to identify novel candidate biomarker proteins differentially expressed in the plasma of patients with early-stage acute myocardial infarction (AMI), using SELDI-TOF-MS as a high-throughput screening technology. Methods: Ten individuals with recent acute ischemic-type chest pain (< 12 h duration) and ST-segment elevation AMI, whether a first (1STEMI) or a second (2STEMI) event, were selected. Blood samples were drawn at six times after STEMI diagnosis. The first stage (T(0)) was in the Emergency Unit before any medication was received, the second was just after primary angioplasty (T(2)), and the next four stages occurred at 12 h intervals after T(0). Individuals (n = 7) with similar risk factors for cardiovascular disease and a normal ergometric test were selected as a control group (CG). Plasma proteomic profiling analysis was performed using top-down (i.e. intact proteins) SELDI-TOF-MS, after processing in a Multiple Affinity Removal Spin Cartridge System (Agilent). Results: Compared with the CG, the 1STEMI group exhibited 510 differentially expressed protein peaks in the first 48 h after the AMI (p < 0.05). The 2STEMI group had approximately 85% fewer differentially expressed protein peaks than those without a previous history of AMI (76, p < 0.05). Among the 16 differentially regulated protein peaks common to both STEMI cohorts (compared with the CG at T(0)), 6 peaks were persistently down-regulated at more than one time-stage and were also inversely correlated with serum protein markers (cTnI, CK and CKMB) during the 48 h period after AMI. Conclusions: Proteomic analysis by SELDI-TOF-MS technology combined with bioinformatics tools demonstrated differential protein expression over a 48 h time course, suggesting a potential role for some of these proteins as biomarkers of the very early stages of AMI, as well as for monitoring early cardiac ischemic recovery. (C) 2011 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

These notes follow on from the material that you studied in CSSE1000 Introduction to Computer Systems, where you studied the details of logic gates, binary numbers and instruction set architectures, using the Atmel AVR microcontroller family as an example. In your present course (METR2800 Team Project I), you need to go on to design and build an application that will include such a microcontroller. These notes focus on programming an AVR microcontroller in C and provide a number of example programs illustrating the use of some of the AVR peripheral devices.

Relevance: 30.00%

Abstract:

Animal-based theories of Pavlovian conditioning propose that patterning discriminations are solved using unique cues or immediate configuring. Recent studies with humans, however, provided evidence that two different rules are used in positive and negative patterning. The present experiment was designed to provide further support for this proposal by tracking the time course of the allocation of cognitive resources. One group was trained on a positive patterning schedule (A-, B-, AB+) and a second on a negative patterning schedule (A+, B+, AB-). Electrodermal responses and secondary-task probe reaction times were measured. In negative patterning, reaction times were slower during reinforced stimuli than during non-reinforced stimuli at both probe positions, while there were no differences in positive patterning. These results support the assumption that negative patterning is solved using a rule that is more complex and requires more resources than the rule employed to solve positive patterning. (C) 2001 Elsevier Science (USA).

Relevance: 30.00%

Abstract:

Objective: To map out the career paths of veterinarians during their first 10 years after graduation, and to determine whether these paths could have been predicted at entry to the veterinary course.
Design: Longitudinal study of students who started their course at The University of Queensland in 1985 and 1986, and who completed questionnaires in their first and fifth years as students, and in their second, sixth and eleventh years as veterinarians.
Methods: Data from 129 (96%) questionnaires completed during the eleventh year after graduation were coded numerically and then analysed, together with data from previous questionnaires, with SAS System 7 for Windows 95.
Results: Ten years after graduation, 80% were doing veterinary work, 60% were in private practice, 40% in small animal practice and 18% in mixed practice. The equivalent of 25% of the working time of all females was taken up by family duties. When part-time work was taken into account, veterinary work constituted the equivalent of 66% of the group working full-time: 52% on small animals, 7% on horses, 6% on cattle/sheep and 1% on pigs/poultry. Those who had grown up on farms with animals were twice as likely to be working with farm animals as those from other backgrounds. Forecasts made on entry to the veterinary course were of no value in predicting who would remain in mixed practice.
Conclusions: Fewer than one-fifth of graduates were in mixed practice after 10 years, but the proportion was higher among those who grew up on farms with animals. Forecasts made at interview before entry to the course were of little value in predicting the likelihood of remaining in mixed veterinary practice.

Relevance: 30.00%

Abstract:

Several recent projects have aimed at promoting Wireless Sensor Networks as an infrastructure technology, in which several independent users can submit applications that execute concurrently across the network. Multiple concurrent applications cause significant energy-usage overhead on sensor nodes that cannot be eliminated by traditional schemes optimized for single-application scenarios. In this paper, we outline two main optimization techniques for reducing power consumption across applications. First, we describe a compiler-based approach that identifies redundant sensing requests across applications and eliminates them. Second, we cluster radio transmissions by concatenating packets from independent applications based on Rate-Harmonized Scheduling.
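The second optimization can be sketched with a simplified simulation (the packet model and the choice of harmonizing period below are assumptions for illustration, not the paper's full algorithm): instead of each application transmitting on its own schedule, every packet is deferred to the next multiple of a single harmonizing period, so packets that land in the same window share one radio wake-up.

```python
def release_times(period, horizon):
    """Packet release times of a periodic sensing application."""
    return range(0, horizon, period)

def naive_wakeups(periods, horizon):
    """Each application transmits immediately: one radio wake-up per
    distinct release instant across all applications."""
    times = {t for p in periods for t in release_times(p, horizon)}
    return len(times)

def harmonized_wakeups(periods, horizon):
    """Rate-Harmonized-style batching: defer each packet to the next
    multiple of the harmonizing period, then send one concatenated
    frame per occupied slot."""
    slot = min(periods)  # harmonizing period (simplifying choice)
    occupied = set()
    for p in periods:
        for t in release_times(p, horizon):
            occupied.add(-(-t // slot) * slot)  # round up to slot boundary
    return len(occupied)

periods, horizon = [20, 30, 50], 600  # three apps, periods in ms
print(naive_wakeups(periods, horizon), harmonized_wakeups(periods, horizon))
# prints: 44 30
```

Deferral trades a bounded extra latency (at most one harmonizing period) for fewer, larger transmissions, which is the energy win on duty-cycled radios.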

Relevance: 30.00%

Abstract:

Vishnu is a tool for XSLT visual programming in Eclipse, a popular and extensible integrated development environment. Rather than writing the XSLT transformations, the programmer loads or edits two document instances, a source document and its corresponding target document, and pairs texts between them by drawing lines over the documents. This form of XSLT programming is intended for simple transformations between related document types, such as HTML formatting or conversion among similar formats. Complex XSLT programs involving, for instance, recursive templates or second-order transformations are out of the scope of Vishnu. We present the architecture of Vishnu, composed of a graphical editor and a programming engine. The editor is an Eclipse plug-in where the programmer loads and edits document examples and pairs their content using graphical primitives. The programming engine receives the data collected by the editor and produces an XSLT program. The design of the engine and the process of creating an XSLT program from examples are also detailed. The process starts with the generation of an initial transformation that maps the source document to the target document. This transformation is fed to a rewrite process in which each step produces a refined version of the transformation. Finally, the transformation is simplified before being presented to the programmer for further editing.
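The kind of program the engine emits can be pictured with a toy generator (hypothetical Python, far simpler than Vishnu's rewrite process): given one source/target pairing made by the programmer, it emits an XSLT template that wraps the matched source text in the target-side markup.

```python
def xslt_from_pair(source_match, target_tag):
    """Emit a minimal XSLT stylesheet: wrap the text of nodes matching
    source_match in target_tag. A toy stand-in for the inference that
    Vishnu performs from paired document examples."""
    return (
        '<xsl:stylesheet version="1.0" '
        'xmlns:xsl="http://www.w3.org/1999/XSL/Transform">\n'
        f'  <xsl:template match="{source_match}">\n'
        f'    <{target_tag}><xsl:value-of select="."/></{target_tag}>\n'
        '  </xsl:template>\n'
        '</xsl:stylesheet>'
    )

# The programmer drew a line pairing the source element 'title'
# with bold text in the target document:
stylesheet = xslt_from_pair("title", "b")
print(stylesheet)
```

A real engine must of course infer the match pattern and output structure from the drawn pairs rather than receive them directly, which is what the initial-transformation and rewrite steps described above provide.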

Relevance: 30.00%

Abstract:

Massive Open Online Courses (MOOCs) are gaining prominence in transversal teaching-learning strategies. However, many issues are still debated, notably assessment, widely recognized as a cornerstone of education. The large number of students involved requires a redefinition of strategies, which often use approaches based on tasks or challenging projects. Under these conditions, assessment is made through peer-reviewed assignments and online quizzes. The peer-reviewed assignments are often based on sample answers or topics, which guide the student in the task of evaluating peers. This chapter analyzes grading and evaluation in MOOCs, especially in science and engineering courses, within the context of education and grading methodologies, and discusses possible perspectives for pursuing grading quality in massive e-learning courses.
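One common way to make peer-reviewed assignments robust at MOOC scale (a generic illustration, not a method proposed in this chapter) is to collect several independent peer scores per submission and aggregate them with the median, which resists a single careless or adversarial grader:

```python
from statistics import median

def aggregate_peer_grade(scores, min_reviews=3):
    """Aggregate peer scores with the median; refuse to emit a grade
    until enough independent reviews have arrived."""
    if len(scores) < min_reviews:
        raise ValueError("not enough peer reviews yet")
    return median(scores)

print(aggregate_peer_grade([78, 82, 80]))      # 80
print(aggregate_peer_grade([78, 82, 80, 15]))  # outlier barely moves it: 79.0
```

Compared with the mean, the median keeps one score of 15 from dragging the grade down by sixteen points, at the cost of ignoring how far outliers lie.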

Relevance: 30.00%

Abstract:

Dissertation submitted for the degree of Doctor (PhD) in Informatics

Relevance: 30.00%

Abstract:

Master's internship report in Political Science and International Relations: Globalization and Environment

Relevance: 30.00%

Abstract:

Machine ethics is an interdisciplinary field of inquiry that emerges from the need to imbue autonomous agents with the capacity for moral decision-making. While some approaches provide implementations in Logic Programming (LP) systems, they have not exploited LP-based reasoning features that appear essential for moral reasoning. This PhD thesis investigates further the appropriateness of LP, notably a combination of LP-based reasoning features and techniques available in LP systems, to machine ethics. Moral facets, as studied in moral philosophy and psychology, that are amenable to computational modeling are identified and mapped to appropriate LP concepts for representing and reasoning about them. The main contributions of the thesis are twofold. First, novel approaches are proposed for employing tabling in contextual abduction and updating, individually and combined, plus an LP approach to counterfactual reasoning; the latter is implemented on top of the aforementioned combined abduction and updating technique with tabling. These are all important for modeling various issues of the aforementioned moral facets. Second, a variety of LP-based reasoning features are applied to model the identified moral facets, through moral examples taken off-the-shelf from the morality literature.
These applications include: (1) Modeling moral permissibility according to the Doctrines of Double Effect (DDE) and Triple Effect (DTE), demonstrating deontological and utilitarian judgments via integrity constraints (in abduction) and preferences over abductive scenarios; (2) Modeling moral reasoning under uncertainty of actions, via abduction and probabilistic LP; (3) Modeling moral updating (which allows an agent to adopt other, possibly overriding, moral rules on top of those it currently follows) via the integration of tabling in contextual abduction and updating; and (4) Modeling moral permissibility and its justification via counterfactuals, where counterfactuals are used for formulating DDE.
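The Doctrine of Double Effect check in item (1) can be caricatured in a few lines (a toy propositional sketch in Python; the thesis's actual formulation uses LP with abduction, integrity constraints and preferences): an action with a harmful effect is permissible only if the harm is a foreseen side effect rather than the means to the good end.

```python
def dde_permissible(action):
    """Toy Doctrine of Double Effect test: the goal must be good,
    and the harm must not be the means by which the goal is achieved."""
    return action["goal_good"] and not action["harm_is_means"]

# Two stock trolley-style cases from the morality literature, hand-encoded:
bystander = {"goal_good": True, "harm_is_means": False}   # harm is a side effect
footbridge = {"goal_good": True, "harm_is_means": True}   # harm is the means

print(dde_permissible(bystander))   # True
print(dde_permissible(footbridge))  # False
```

The LP rendering in the thesis goes well beyond this: counterfactuals of the form "would the harm still occur if the good outcome were achieved without the action?" are what let the system distinguish means from side effect automatically, rather than by hand-set flags as above.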