914 results for software as teaching tool


Relevance:

30.00%

Publisher:

Abstract:

In the realm of computer programming, the experience of writing a program is used to reinforce concepts and evaluate ability. This research uses three case studies to evaluate the introduction of testing through Kolb's Experiential Learning Model (ELM). We then analyze the impact of those testing experiences to determine methods for improving future courses. The first testing experience that students encounter is unit test reports in their early courses. This course demonstrates that automating and improving feedback can provide more ELM iterations. The JUnit Generation (JUG) tool also provided a positive experience for the instructor by reducing the overall workload. Later, undergraduate and graduate students have the opportunity to work together in a multi-role Human-Computer Interaction (HCI) course. The interactions use usability analysis techniques with graduate students as usability experts and undergraduate students as design engineers. Students get experience testing the user experience of their product prototypes using methods varying from heuristic analysis to user testing. From this course, we learned the importance of the instructor's role in the ELM. As more roles were added to the HCI course, a desire arose to provide more complete, quality-assured software. This inspired the addition of unit testing experiences to the course. However, we learned that significant preparations must be made to apply the ELM when students are resistant. The research presented through these courses was driven by the recognition of a need for testing in a Computer Science curriculum. Our understanding of the ELM suggests the need for student experience when being introduced to testing concepts. We learned that experiential learning, when appropriately implemented, can provide benefits to the Computer Science classroom. When examined together, these course-based research projects provided insight into building strong testing practices into a curriculum.
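The unit-test reports described above can be sketched in miniature. This is not the JUG tool itself (which generates JUnit tests for Java); it is a hypothetical Python analogue using the standard `unittest` module, showing how a suite can be run programmatically and summarized as short feedback for a student.

```python
import unittest

def mean(values):
    """Average of a non-empty list of numbers (the student's code under test)."""
    if not values:
        raise ValueError("mean() of empty list")
    return sum(values) / len(values)

class MeanTests(unittest.TestCase):
    def test_simple_average(self):
        self.assertEqual(mean([2, 4, 6]), 4)

    def test_empty_list_rejected(self):
        with self.assertRaises(ValueError):
            mean([])

def run_report():
    """Run the suite and return a one-line feedback string."""
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(MeanTests)
    result = unittest.TestResult()
    suite.run(result)
    return (f"{result.testsRun} tests, "
            f"{len(result.failures)} failures, {len(result.errors)} errors")

print(run_report())  # 2 tests, 0 failures, 0 errors
```

Automating exactly this kind of summary is what allows each submission to become another ELM iteration without adding instructor workload.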


This paper describes the ideas and problems of the Edukalibre e-learning project, in which the author takes part. The project's basic objective is the development and exploitation of software components for web-based information systems applied to education, as well as the organization of teaching material for them. The paper addresses the problem of mathematically oriented courseware and describes experience in developing a LaTeX-supporting online conversion tool.
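The conversion idea can be illustrated with a deliberately minimal sketch. The rules below are hypothetical and cover only a few LaTeX text commands; a real converter such as the Edukalibre tool would need a proper LaTeX parser rather than regular expressions.

```python
import re

# Illustrative LaTeX-to-HTML rules (a sketch, not the Edukalibre tool).
RULES = [
    (re.compile(r"\\textbf\{([^}]*)\}"), r"<b>\1</b>"),
    (re.compile(r"\\textit\{([^}]*)\}"), r"<i>\1</i>"),
    (re.compile(r"\\frac\{([^}]*)\}\{([^}]*)\}"),
     r"<sup>\1</sup>&frasl;<sub>\2</sub>"),
]

def latex_to_html(src: str) -> str:
    """Apply each rewrite rule in turn to a LaTeX fragment."""
    for pattern, repl in RULES:
        src = pattern.sub(repl, src)
    return src

print(latex_to_html(r"\textbf{Sum}: \frac{a}{b}"))
# <b>Sum</b>: <sup>a</sup>&frasl;<sub>b</sub>
```

Even this toy version shows why mathematical courseware benefits from server-side conversion: authors write familiar LaTeX, and students receive renderable HTML.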


This research project is situated at the intersection of education science, computer science, and school practice, and therefore has a strongly interdisciplinary character. From the perspective of education science, it is a research project in the fields of e-learning and multimedia learning, addressing the question of suitable information systems for creating and exchanging digital, multimedia, interactive learning modules. To this end, the methodological and didactic advantages of digital learning content over classical media such as book and paper were first compiled, and potential benefits of new Web 2.0 technologies were identified. Building on this, existing authoring tools for creating digital learning modules and existing exchange platforms were analyzed with regard to how far they already support and use Web 2.0 technologies. From the perspective of computer science, the analysis of existing systems yielded a requirements profile for a new authoring tool and a new exchange platform for digital learning modules. Following the Design Science Research approach, the new system was realized in an iterative development process as the web application LearningApps.org and continuously evaluated with teachers from school practice. Current web technologies were used in the development. The result of the research is a productive information system that is already used by thousands of users in several countries, both in schools and in industry. An empirical study confirmed the goal pursued with the system's development: to simplify the creation and exchange of digital learning modules. From the perspective of school practice, LearningApps.org contributes to methodological diversity and to the use of ICT in the classroom.
The tool's orientation toward mobile devices and 1:1 computing matches the general trend in education. By linking the tool with current software developments for producing digital schoolbooks, textbook publishers are also addressed as a target group.


Teaching is a dynamic activity. It can be very effective if its impact is constantly monitored and adjusted to the demands of changing social contexts and the needs of learners. This implies that teachers need to be aware of teaching and learning processes. Moreover, they should constantly question their didactical methods and the learning resources they provide to their students. They should reflect on whether their actions are suitable, and they should regulate their teaching, e.g., by updating learning materials based on new knowledge about learners, or by motivating learners to engage in further learning activities. In recent years, a rising interest in 'learning analytics' has become observable. This interest is motivated by the availability of massive amounts of educational data. The continuously increasing processing power, and a strong motivation for discovering new information from these pools of educational data, are also pushing further developments within the learning analytics research field. Learning analytics could be a method for reflective teaching practice that enables and guides teachers to investigate and evaluate their work in future learning scenarios. However, this potentially positive impact has not yet been sufficiently verified by learning analytics research. Another method that pursues these goals is 'action research'. Learning analytics promises to initiate action research processes because it facilitates awareness, reflection, and regulation of teaching activities analogous to action research. Therefore, this thesis joins both concepts in order to improve the design of learning analytics tools. The central research questions of this thesis are: What are the dimensions of learning analytics in relation to action research that need to be considered when designing a learning analytics tool? How does a learning analytics dashboard affect the teachers of technology-enhanced university lectures regarding 'awareness', 'reflection', and 'action'? Does it initiate action research? What are the central requirements for a learning analytics tool that pursues such effects? This project followed design-based research principles in order to answer these research questions. The main contributions are: a theoretical reference model that connects action research and learning analytics, the conceptualization and implementation of a learning analytics tool, a requirements catalogue for useful and usable learning analytics design based on evaluations, a tested procedure for impact analysis, and guidelines for the introduction of learning analytics into higher education.


Aims: To evaluate the accuracy and reproducibility of aortic annulus sizing using a multislice computed tomography (MSCT) based aortic root reconstruction tool compared with conventional imaging among patients evaluated for transcatheter aortic valve replacement (TAVR). Methods and results: Patients referred for TAVR underwent standard preprocedural assessment of aortic annulus parameters using MSCT, angiography and transoesophageal echocardiography (TEE). Three-dimensional (3-D) reconstruction of MSCT images of the aortic root was performed using 3mensio (3mensio Medical Imaging BV, Bilthoven, The Netherlands), allowing for semi-automated delineation of the annular plane and assessment of annulus perimeter, area, maximum, minimum and virtual diameters derived from area and perimeter (aVD and pVD). A total of 177 patients were enrolled. We observed a good inter-observer variability of 3-D reconstruction assessments with concordance coefficients for agreement of 0.91 (95% CI: 0.87-0.93) and 0.91 (0.88-0.94) for annulus perimeter and area assessments, respectively. 3-D derived pVD and aVD correlated very closely with a concordance coefficient of 0.97 (0.96-0.98) with a mean difference of 0.5±0.3 mm (pVD-aVD). 3-D derived pVD showed the best, but moderate concordance with diameters obtained from coronal MSCT (0.67, 0.56-0.75; 0.3±1.8 mm), and the lowest concordance with diameters obtained from TEE (0.42, 0.31-0.52; 1.9±1.9 mm). Conclusions: MSCT-based 3-D reconstruction of the aortic annulus using the 3mensio software enables accurate and reproducible assessment of aortic annulus dimensions.
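The perimeter- and area-derived virtual diameters (pVD and aVD) mentioned above follow from standard circle geometry: each treats the (usually elliptical) annulus as an equivalent circle. The sketch below shows those formulas; the input values are hypothetical, and this is not the 3mensio implementation.

```python
import math

def virtual_diameters(perimeter_mm: float, area_mm2: float):
    """Virtual diameters of an annulus modeled as an equivalent circle."""
    pvd = perimeter_mm / math.pi             # circle with the same perimeter
    avd = 2 * math.sqrt(area_mm2 / math.pi)  # circle with the same area
    return pvd, avd

# Hypothetical measurements for a nearly circular annulus.
pvd, avd = virtual_diameters(perimeter_mm=75.4, area_mm2=452.4)
print(f"pVD = {pvd:.1f} mm, aVD = {avd:.1f} mm")
```

For a perfect circle the two diameters coincide; the more elliptical the annulus, the more pVD exceeds aVD, which is consistent with the small positive mean difference (pVD - aVD) reported in the study.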


In the long run, the widespread use of slide scanners by pathologists requires an adaptation of teaching methods in histology and cytology in order to exploit these new possibilities of image processing and presentation via the internet. Accordingly, we were looking for a tool for teaching the microscopic anatomy, histology, and cytology of tissue samples that could combine image data from light and electron microscopes independently of microscope suppliers. Using the example of a section through a villus of the jejunum, we describe here how to process image data from light and electron microscopes to obtain one image stack that allows a correlation of structures from the microscopic-anatomical to the cytological level. With commercially available image-presentation software that we adapted to our needs, we present a platform that allows the presentation of this new, but also of older, material independently of microscope suppliers.


Both TBL and PBL attempt to maximally engage the learner, and both are designed to encourage interactive teaching/learning. PBL is student-centered; TBL, in contrast, is typically instructor-centered. The PBL Executive Committee of the UTHSC-Houston Medical School, in an attempt to capture the pedagogical advantages of both PBL and TBL, implemented a unique PBL experience in the ICE/PBL course during the final block of PBL instruction in year 2. PBL cases provided the content knowledge for focused learning. The subsequent, related TBL exercises fostered integration and critical thinking about each of these cases. [See PDF for complete abstract]


Background Atrial fibrillation (AF) is common and may have severe consequences. Continuous long-term electrocardiogram (ECG) is widely used for AF screening. Recently, commercial ECG analysis software was launched, which automatically detects AF in long-term ECGs. It has been claimed that such tools offer reliable AF screening and save time for ECG analysis. However, this has not been investigated in a real-life patient cohort. Objective To investigate the performance of automatic software-based screening for AF in long-term ECGs. Methods Two independent physicians manually screened 22,601 hours of continuous long-term ECGs from 150 patients for AF. Presence, number, and duration of AF episodes were registered. Subsequently, the recordings were screened for AF by an established ECG analysis software (Pathfinder SL), and its performance was validated against the thorough manual analysis (gold standard). Results Sensitivity and specificity for AF detection was 98.5% (95% confidence interval 91.72%–99.96%) and 80.21% (95% confidence interval 70.83%–87.64%), respectively. Software-based AF detection was inferior to manual analysis by physicians (P < .0001). Median AF duration was underestimated (19.4 hours vs 22.1 hours; P < .001) and median number of AF episodes was overestimated (32 episodes vs 2 episodes; P < .001) by the software. In comparison to extensive quantitative manual ECG analysis, software-based analysis saved time (2 minutes vs 19 minutes; P < .001). Conclusion Owing to its high sensitivity and ability to save time, software-based ECG analysis may be used as a screening tool for AF. An additional manual confirmatory analysis may be required to reduce the number of false-positive findings.
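The sensitivity and specificity reported above reduce to simple ratios over a confusion matrix. The counts in this sketch are hypothetical (the abstract does not give the raw matrix) and were chosen only to roughly reproduce the reported rates.

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity and specificity of a detector against a gold standard."""
    sens = tp / (tp + fn)  # correctly flagged AF cases / all true AF cases
    spec = tn / (tn + fp)  # correctly cleared cases / all truly AF-free cases
    return sens, spec

# Hypothetical counts approximating the study's 98.5% / 80.2% figures.
sens, spec = sensitivity_specificity(tp=65, fn=1, tn=77, fp=19)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```

The asymmetry visible here matches the paper's conclusion: a highly sensitive but only moderately specific detector is usable as a screening step, provided positives get a manual confirmatory review.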


Software architecture is the result of a design effort aimed at ensuring a certain set of quality attributes. As we show, quality requirements are commonly specified in practice but are rarely validated using automated techniques. In this paper we analyze and classify commonly specified quality requirements after interviewing professionals and running a survey. We report on tools used to validate those requirements and comment on the obstacles encountered by practitioners when performing such activities (e.g., insufficient tool support; poor understanding of users' needs). Finally, we discuss opportunities for increasing the adoption of automated tools based on the information we collected during our study (e.g., using a business-readable notation for expressing quality requirements; increasing awareness by monitoring non-functional aspects of a system).


Software corpora facilitate reproducibility of analyses; however, static analysis of an entire corpus still requires considerable effort, often duplicated unnecessarily by multiple users. Moreover, most corpora are designed for single languages, increasing the effort of cross-language analysis. To address these aspects we propose Pangea, an infrastructure allowing fast development of static analyses on multi-language corpora. Pangea uses language-independent meta-models stored as object-model snapshots that can be directly loaded into memory and queried without any parsing overhead. To reduce the effort of performing static analyses, Pangea provides out-of-the-box support for: creating and refining analyses in a dedicated environment; deploying an analysis on an entire corpus, using a runner that supports parallel execution; and exporting results in various formats. In this tool demonstration we introduce Pangea and provide several usage scenarios that illustrate how it reduces the cost of analysis.
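The runner pattern described above, apply one analysis function to every project snapshot in parallel and collect the results, can be sketched as follows. The dict-based "snapshots", the example metric, and all names here are hypothetical stand-ins for Pangea's language-independent meta-models, not its actual API.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical pre-parsed corpus: one snapshot (plain dict) per project.
CORPUS = {
    "projA": {"classes": 120, "loc": 15_000},
    "projB": {"classes": 45, "loc": 6_200},
    "projC": {"classes": 300, "loc": 48_000},
}

def avg_loc_per_class(snapshot: dict) -> float:
    """One example analysis: average lines of code per class."""
    return snapshot["loc"] / snapshot["classes"]

def run_on_corpus(analysis, corpus, workers=4):
    """Run an analysis over every snapshot in parallel, keyed by project."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {name: pool.submit(analysis, snap)
                   for name, snap in corpus.items()}
        return {name: f.result() for name, f in futures.items()}

results = run_on_corpus(avg_loc_per_class, CORPUS)
print(results)
```

Because the snapshots are already parsed, each analysis run pays no parsing cost, which is exactly the overhead Pangea is designed to eliminate.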


We present the results of an investigation into the nature of information needs of software developers who work in projects that are part of larger ecosystems. This work is based on a quantitative survey of 75 professional software developers. We corroborate the results identified in the survey with needs and motivations proposed in a previous survey and discover that tool support for developers working in an ecosystem context is even more meager than we thought: mailing lists and internet search are the most popular tools developers use to satisfy their ecosystem-related information needs.


Stray light contamination considerably reduces the precision of photometry of faint stars for low-altitude spaceborne observatories. When measuring faint objects, stray light contamination must be coped with in order to avoid systematic impacts on low signal-to-noise images. Stray light contamination can be represented by a flat offset in CCD data. Mitigation begins with a comprehensive study during the design phase, followed by the use of target-pointing optimisation and post-processing methods. We present a code that aims at simulating the stray light contamination in low Earth orbit coming from reflection of solar light by the Earth. StrAy Light SimulAtor (SALSA) is a tool intended to be used at an early stage to evaluate the effective visible region in the sky and therefore to optimise the observation sequence. SALSA can compute Earth stray light contamination for significant periods of time, allowing mission-wide parameters to be optimised (e.g., imposing constraints on the point source transmission function (PST) and/or on the altitude of the satellite). It can also be used to study the behaviour of the stray light at different seasons or latitudes. Given the position of the satellite with respect to the Earth and the Sun, SALSA computes the stray light at the entrance of the telescope following a geometrical technique. After characterising the illuminated region of the Earth, the portion of illuminated Earth that affects the satellite is calculated. Then the flux of reflected solar photons is evaluated at the entrance of the telescope. Using the PST of the instrument, the final stray light contamination at the detector is calculated. The analysis tools include time-series analysis of the contamination, evaluation of the sky coverage, and an object visibility predictor. Effects of the South Atlantic Anomaly and of any shutdown periods of the instrument can be added. Several designs or mission concepts can easily be tested and compared. The code is not intended as a stand-alone mission designer; its mandatory inputs are a time series describing the trajectory of the satellite and the characteristics of the instrument. This software suite has been applied to the design and analysis of CHEOPS (CHaracterizing ExOPlanet Satellite). This mission requires very high-precision photometry to detect very shallow transits of exoplanets. Different altitudes and characteristics of the detector have been studied in order to find the best parameters that reduce the effect of contamination. © (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
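The first geometrical step, characterising which part of the Earth is illuminated relative to the satellite, can be reduced to a dot-product test. The sketch below is an illustration of that geometry under simplifying assumptions (geocentric vectors, point-like Sun), not the SALSA code.

```python
import math

def dot(a, b):
    """Dot product of two 3-vectors given as tuples."""
    return sum(x * y for x, y in zip(a, b))

def over_sunlit_hemisphere(sat_pos, sun_dir) -> bool:
    """True if the sub-satellite (nadir) point lies on the sunlit
    hemisphere, i.e. the angle between the geocentric satellite
    position and the Sun direction is below 90 degrees."""
    cos_angle = dot(sat_pos, sun_dir) / (
        math.hypot(*sat_pos) * math.hypot(*sun_dir))
    return cos_angle > 0.0

sun = (1.0, 0.0, 0.0)  # Sun along +x, arbitrary units
day = over_sunlit_hemisphere((0.7, 0.7, 0.0), sun)    # day side
night = over_sunlit_hemisphere((-1.0, 0.2, 0.0), sun)  # night side
print(day, night)
```

Only when this test passes does a simulator need to integrate the reflected solar flux over the visible illuminated cap and fold it through the instrument's PST.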


BACKGROUND In 2007, a first survey of undergraduate palliative care teaching in Switzerland revealed major heterogeneity in palliative care content, allocated hours, and distribution throughout the six-year curriculum in Swiss medical faculties. This second survey, in 2012/13, was initiated as part of the current Swiss national strategy in palliative care (2010-2015) to serve as a longitudinal monitoring instrument and as a basis for redefining palliative care learning objectives and curriculum planning in our country. METHODS As in 2007, a questionnaire was sent to the deans of all five medical faculties in Switzerland in 2012. It consisted of eight sections: basic background information, current content and hours in dedicated palliative care blocks, current palliative care content in other courses, topics related to palliative care presented in other courses, recent attempts at improving palliative care content, palliative care content in examinations, challenges, and overall summary. Content analysis was performed and the results matched with recommendations from the EAPC for undergraduate training in palliative medicine as well as with recommendations from overseas countries. RESULTS There is a considerable increase in palliative care content, academic teaching staff, and hours in all medical faculties compared to 2007. No Swiss medical faculty reaches the 40 hours dedicated specifically to palliative care recommended by the EAPC. Topics, teaching methods, distribution throughout the different years, and compulsory attendance still differ widely. Based on these results, the official Swiss Catalogue of Learning Objectives (SCLO) was complemented with 12 new learning objectives for palliative and end-of-life care (2013), and a national basic script for palliative care was published (2015). CONCLUSION Performing periodic surveys of palliative care teaching at national medical faculties has proven to be a useful tool to adapt the national teaching framework and to improve the recognition of palliative medicine as an integral part of medical training.


OBJECTIVES The aims of the study were to use cone beam computed tomography (CBCT) images of nasopalatine duct cysts (NPDC) and to calculate the diameter, surface area, and 3D volume using a custom-made software program. Furthermore, any associations of the dimensions of NPDC with age, gender, presence/absence of maxillary incisors/canines (MI/MC), endodontic treatment of MI/MC, presenting symptoms, and postoperative complications were evaluated. MATERIAL AND METHODS The study comprised 40 patients with a histopathologically confirmed NPDC. On preoperative CBCT scans, curves delineating the cystic borders were drawn in all planes, and the widest diameter (in millimeters), surface area (in square millimeters), and volume (in cubic millimeters) were calculated. RESULTS The overall mean cyst diameter was 15 mm (range 7-47 mm), the mean cyst surface area 566 mm² (84-4,516 mm²), and the mean cyst volume 1,735 mm³ (65-25,350 mm³). For 22 randomly allocated cases, a second measurement resulted in a mean absolute aberration of ±4.2% for the volume, ±2.8% for the surface, and ±4.9% for the diameter. A statistically significant association was found between the CBCT-determined cyst measurements and the need for preoperative endodontic treatment of MI/MC and postoperative complications. CONCLUSION In the hands of a single experienced operator, the novel software exhibited high repeatability for measurements of cyst dimensions. Further studies are needed to assess the application of this tool for dimensional analysis of different jaw cysts and lesions, including treatment planning. CLINICAL RELEVANCE Accurate radiographic information on the bone volume lost (osteolysis) due to expansion of a cystic lesion in three dimensions could help in personalized treatment planning.
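The repeatability figures above (mean absolute aberration between two measurement sessions) correspond to a simple computation, sketched below. The value pairs are hypothetical; the study's per-case measurements are not given in the abstract.

```python
def mean_abs_aberration_pct(first, second):
    """Mean absolute percentage deviation of a repeated measurement
    series from the first series, as a repeatability metric."""
    diffs = [abs(a - b) / a * 100 for a, b in zip(first, second)]
    return sum(diffs) / len(diffs)

# Hypothetical cyst volumes (mm^3) measured twice by the same operator.
volumes_run1 = [1700.0, 250.0, 900.0]
volumes_run2 = [1768.0, 240.0, 918.0]
print(f"mean absolute aberration: {mean_abs_aberration_pct(volumes_run1, volumes_run2):.1f} %")
```

Keeping this metric in the low single digits across diameter, surface, and volume is what supports the repeatability claim made for the software.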


This paper outlines a qualitative research tool designed to explore personal identity formation as described by Erik Erikson and offers self-reflective and anonymous evaluative comments made by college students after completing this task. Subjects compiled a list of 200 myths, customs, fables, rituals, and beliefs from their family of origin and then reflected upon the relevance and meaning of such items. The research and instructional tool described in the paper should be of considerable interest to teachers who work to promote self-reflection amongst adolescents as well as case study researchers and therapists who wish to study identity formation and values.