914 results for software as teaching tool
Abstract:
The article introduces the E-learning Circle, a tool developed to assure the quality of the software design process of e-learning systems, considering pedagogical principles as well as technology. The E-learning Circle consists of a number of concentric circles which are divided into three sectors. The content of the inner circles is based on pedagogical principles, while the outer circle specifies how the pedagogical principles may be implemented with technology. The circle’s centre is dedicated to the subject taught, ensuring focus on the specific subject’s properties. The three sectors represent the student, the teacher and the learning objectives. The strengths of the E-learning Circle are its compact presentation combined with the overview it provides, as well as its usefulness as a design tool for dealing with complexity, providing a common language and embedding best practice. The E-learning Circle is not a prescriptive method, but can be used within several design models and processes. The article presents two projects where the E-learning Circle was used as a design tool.
Abstract:
The use of virtual learning environments in Higher Education (HE) has been growing in Portugal, driven by the Bologna Process. One example is the use of Learning Management Systems (LMS), which represent an opportunity to leverage technological advances in the educational process. The progress of information and communication technologies (ICT), coupled with the rapid development of the Internet, has brought significant challenges to educators, requiring a thorough knowledge of the implementation process. These field notes present the results of a survey of teachers at a private HE institution on their use of Moodle as a tool to support face-to-face teaching. An essentially exploratory research methodology based on a questionnaire survey, supported by statistical analysis, made it possible to identify teachers' motivations, types of use and perceptions regarding this kind of tool. The results showed that a narrow majority of teachers (58%) had not changed their pedagogical practice as a consequence of using Moodle; among those who did, 67% had attended institutional internal training. Some of the results obtained suggest further investigation and provide guidelines for planning future internal training.
Abstract:
During the last decade, medical education in the German-speaking world has been striving to become more practice-oriented. This is currently being achieved in many schools through the implementation of simulation-based instruction in Skills Labs. Simulators are thus an essential part of this type of medical training, and their acquisition and operation by a Skills Lab require a large outlay of resources. Therefore, the Practical Skills Committee of the Medical Education Society (GMA) introduced a new project, which aims to improve the flow of information between the Skills Labs and enable a transparent assessment of the simulators via an online database (the Simulator Network).
Abstract:
Oligonucleotides comprising unnatural building blocks, which interfere with the translation machinery, have gained increased attention for the treatment of gene-related diseases (e.g. antisense, RNAi). Due to structural modifications, synthetic oligonucleotides exhibit increased biostability and bioavailability upon administration. Consequently, classical enzyme-based sequencing methods are not applicable to their sequence elucidation and verification. Tandem mass spectrometry is the method of choice for performing such tasks, since gas-phase dissociation is not restricted to natural nucleic acids. However, tandem mass spectrometric analysis can generate product ion spectra of tremendous complexity, as the number of possible fragments grows rapidly with increasing sequence length. The fact that structural modifications affect the dissociation pathways greatly increases the variety of analytically valuable fragment ions. The gas-phase dissociation of oligonucleotides is characterized by the cleavage of one of the four bonds along the phosphodiester chain, by the accompanying loss of nucleobases, and by the generation of internal fragments due to secondary backbone cleavage. For example, an 18-mer oligonucleotide yields a total number of 272,920 theoretical fragment ions. In contrast to the processing of peptide product ion spectra, which nowadays is highly automated, there is a lack of tools assisting the interpretation of oligonucleotide data. The existing web-based and stand-alone software applications are primarily designed for the sequence analysis of natural nucleic acids, but do not account for chemical modifications and adducts. Consequently, we developed a software tool to support the interpretation of mass spectrometric data of natural and modified nucleic acids and their adducts with chemotherapeutic agents.
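To give a sense of why the number of candidate fragments grows so quickly, the minimal sketch below counts only the terminal fragment series produced by cleaving one of the four backbone bonds at each phosphodiester linkage; internal fragments, base losses, adducts and charge states, which drive the much larger total cited in the abstract, are deliberately omitted, and the counting model is an illustrative assumption, not the paper's software.

```python
# Rough illustration of fragment-count growth for oligonucleotide dissociation.
# Only terminal fragment series are counted (e.g. a/b/c/d ions from the 5' end
# and w/x/y/z ions from the 3' end); internal fragments and base losses are omitted.

def terminal_fragment_count(length: int) -> int:
    cleavage_sites = length - 1   # phosphodiester linkages in the chain
    bonds_per_site = 4            # four cleavable backbone bonds per linkage
    series_per_bond = 2           # one 5'-terminal and one 3'-terminal fragment
    return cleavage_sites * bonds_per_site * series_per_bond

for n in (6, 12, 18):
    print(f"{n}-mer: {terminal_fragment_count(n)} terminal fragments")
```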
Abstract:
BACKGROUND The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. METHODS We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICER). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. RESULTS Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and that of VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. CONCLUSION Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex- and age-distributions and unit costs.
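The ICERs reported above follow the standard definition: a strategy's additional lifetime cost divided by the additional DALYs it averts relative to the next less expensive comparator. The sketch below shows that calculation with purely hypothetical strategy names, costs and DALYs, not the paper's model outputs.

```python
# Minimal sketch of an ICER calculation; all numbers are hypothetical.
strategies = [
    # (name, lifetime cost per patient in US$, DALYs averted per patient)
    ("clinical monitoring", 4000.0, 0.0),   # reference strategy
    ("CD4 monitoring",      4950.0, 0.5),
    ("POC-VL monitoring",   6100.0, 1.2),
]

# Order by cost, then compare each strategy with the previous (cheaper) one.
strategies.sort(key=lambda s: s[1])
for (prev_name, prev_cost, prev_dalys), (name, cost, dalys) in zip(strategies, strategies[1:]):
    icer = (cost - prev_cost) / (dalys - prev_dalys)
    print(f"{name} vs {prev_name}: US${icer:.0f} per DALY averted")
```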
Abstract:
Affiliation: Porto, Melina. Universidad Nacional de La Plata. Facultad de Humanidades y Ciencias de la Educación; Argentina.
Abstract:
This article presents software for determining the statistical behavior of qualitative survey data that have previously been transformed into quantitative data using a Likert scale. The main intention is to offer users a useful tool for obtaining statistical characteristics and forecasts of financial risks in a fast and simple way. Additionally, this paper presents the definition of operational risk. The article also explains different techniques for conducting surveys with a Likert scale (Avila, 2008) to capture expert opinion through the transformation of qualitative data into quantitative data. While a single expert's opinion about a risk is easy to interpret, when users have many surveys and matrices it becomes difficult to obtain results, because common data must be compared and a representative statistical value must be extracted from them to obtain the weight of each risk. Finally, the article describes the development of the Qualitative Operational Risk Software (QORS), which has been designed to determine the root of risks in organizations and their operational value at risk, OpVaR (Jorion, 2008; Chernobai et al., 2008), when the input data come from expert opinion and its associated matrices.
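As a rough illustration of the workflow described above, and not of the QORS implementation itself, the sketch below maps Likert responses from several hypothetical expert surveys onto a numeric scale, aggregates them into per-risk weights, and takes an empirical quantile of a toy loss distribution as a simple OpVaR-style figure; the scale, the aggregation rule and the loss model are all assumptions.

```python
import numpy as np

# Hypothetical mapping of a 5-point Likert scale to numeric scores.
LIKERT = {"very low": 1, "low": 2, "medium": 3, "high": 4, "very high": 5}

# Hypothetical expert surveys: each expert rates the severity of three risks.
surveys = [
    {"fraud": "high",      "system failure": "medium", "legal": "low"},
    {"fraud": "very high", "system failure": "medium", "legal": "medium"},
    {"fraud": "high",      "system failure": "low",    "legal": "low"},
]

# Transform qualitative answers into quantitative scores, then weight each risk
# by its mean score relative to the total (one simple aggregation choice).
scores = {risk: np.mean([LIKERT[s[risk]] for s in surveys]) for risk in surveys[0]}
weights = {risk: value / sum(scores.values()) for risk, value in scores.items()}
print("risk weights:", weights)

# Toy loss distribution scaled by the aggregated weights; the 99th percentile
# plays the role of an operational value at risk (OpVaR).
rng = np.random.default_rng(0)
losses = sum(w * rng.lognormal(mean=10, sigma=1, size=10_000) for w in weights.values())
print("OpVaR (99%):", np.quantile(losses, 0.99))
```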
Abstract:
New concepts in air navigation have been introduced recently. Among them are trajectory optimization, 4D trajectories, the Reference Business Trajectory (RBT), trajectory-based operations (TBO), the Continuous Descent Approach (CDA) and Advanced CDA (ACDA), conflict resolution, arrival management (AMAN), and the introduction of new aircraft (UAVs, UASs) into the airspace. Although some of these concepts are new, future Air Traffic Management will maintain the four ATM key performance areas: Safety, Capacity, Efficiency, and Environmental impact. Consequently, the performance of the ATM system is directly related to the accuracy with which the future evolution of the traffic can be predicted. In this sense, future air traffic management will require a variety of support tools to provide suitable help to users and engineers involved in airspace management. Most of these tools rely on an appropriate trajectory prediction module as their main component. Therefore, the purpose of these tools is the testing and evaluation of air navigation concepts before they become fully operational. The aim of this paper is to provide an overview of the design of a software tool for estimating aircraft trajectories adapted to these air navigation concepts. Other uses of the tool, such as controller design, vertical navigation assessment, procedure validation, and hardware- and software-in-the-loop simulation, are also supported. The paper describes the process followed to design the tool, the software modules needed for accurate performance, and the process followed to validate the output data.
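To make the role of a trajectory prediction module concrete, here is a deliberately simple point-mass sketch that integrates along-track distance, altitude and speed forward in time under constant climb/descent and acceleration rates; the state variables, rates and time step are illustrative assumptions and do not reflect the model described in the paper.

```python
from dataclasses import dataclass

@dataclass
class State:
    t: float    # time [s]
    x: float    # along-track distance [m]
    alt: float  # altitude [m]
    tas: float  # true airspeed [m/s]

def predict(state: State, rocd: float, accel: float, dt: float, steps: int) -> list[State]:
    """Integrate forward assuming constant rate of climb/descent and acceleration."""
    out = [state]
    for _ in range(steps):
        s = out[-1]
        out.append(State(t=s.t + dt,
                         x=s.x + s.tas * dt,
                         alt=s.alt + rocd * dt,
                         tas=s.tas + accel * dt))
    return out

# Example: constant-speed descent at -7.5 m/s starting near FL350.
for s in predict(State(0.0, 0.0, 10_668.0, 230.0), rocd=-7.5, accel=0.0, dt=10.0, steps=6):
    print(f"t={s.t:5.0f} s  x={s.x/1000:6.1f} km  alt={s.alt:7.0f} m")
```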
Abstract:
The increasing use of video editing software has created a need for faster and more efficient editing tools. Here, we propose a lightweight, high-quality video indexing tool suitable for video editing software.
Abstract:
The increasing use of video editing software requires faster and more efficient editing tools. As a first step, these tools perform a temporal segmentation into shots, which allows later construction of indexes describing the video content. Here, we propose a novel real-time, high-quality shot detection strategy suitable for the latest generation of video editing software, which requires both low computational cost and high-quality results. While abrupt transitions are detected through a very fast pixel-based analysis, gradual transitions are obtained from an efficient edge-based analysis. Both analyses are reinforced with a motion analysis that helps to detect and discard false detections. This motion analysis is carried out exclusively over a reduced set of candidate transitions, thus keeping the computational cost within the requirements demanded by new applications.
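For intuition only, the sketch below implements the simplest form of the pixel-based test for abrupt transitions: a cut is flagged when the mean absolute luminance difference between consecutive frames exceeds a threshold. The edge-based analysis for gradual transitions and the motion-based rejection of false detections described in the abstract are not reproduced, and the threshold value is an assumption.

```python
import numpy as np

def abrupt_cuts(frames: np.ndarray, threshold: float = 30.0) -> list[int]:
    """frames: array of shape (n_frames, height, width) with grayscale values 0-255.
    Returns indices of frames that start a new shot under a simple difference test."""
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0)).mean(axis=(1, 2))
    return [i + 1 for i, d in enumerate(diffs) if d > threshold]

# Tiny synthetic example: a scene change between frames 2 and 3.
frames = np.concatenate([np.full((3, 4, 4), 20.0), np.full((3, 4, 4), 200.0)])
print(abrupt_cuts(frames))   # -> [3]
```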
Abstract:
The aim of the paper is to discuss the use of knowledge models to formulate general applications. First, the paper presents the recent evolution of the software field, where increasing attention is paid to conceptual modeling. Then, the current state of knowledge modeling techniques is described, where increased reliability is achieved through modern knowledge acquisition techniques and supporting tools. The KSM (Knowledge Structure Manager) tool is described next. First, the concept of knowledge area is introduced as a building block, grouping the methods used to perform a collection of tasks together with the bodies of knowledge that provide the basic methods for the basic tasks. Then, the CONCEL language for defining domain vocabularies and the LINK language for formulating methods are introduced. Finally, the object-oriented implementation of a knowledge area is described and a general methodology for application design and maintenance supported by KSM is proposed. To illustrate the concepts and methods, an example system for intelligent traffic management in a road network is described. This example is followed by a proposal for generalizing the resulting architecture for reuse. Finally, some concluding comments are offered on the feasibility of using the knowledge modeling tools and methods for general application design.
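A purely hypothetical object-oriented sketch of a "knowledge area" in the spirit of the abstract follows: a unit bundling a domain vocabulary with the methods that perform its tasks. Class and method names are illustrative and do not correspond to KSM's actual API, CONCEL, or LINK.

```python
# Hypothetical sketch of a knowledge area: domain vocabulary plus task methods.
class KnowledgeArea:
    def __init__(self, name: str, vocabulary: dict[str, str]):
        self.name = name
        self.vocabulary = vocabulary   # domain concepts (cf. CONCEL vocabularies)
        self.tasks = {}                # task name -> method (cf. LINK method formulations)

    def add_task(self, task: str, method):
        self.tasks[task] = method

    def perform(self, task: str, **inputs):
        return self.tasks[task](self.vocabulary, **inputs)

# Toy traffic-management knowledge area with one diagnostic task.
congestion = KnowledgeArea("congestion assessment",
                           {"occupancy": "fraction of road section occupied"})
congestion.add_task("diagnose",
                    lambda vocab, occupancy: "congested" if occupancy > 0.8 else "fluid")
print(congestion.perform("diagnose", occupancy=0.9))   # -> congested
```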
Abstract:
A plan for organizing the teaching of software engineering in the computer science degrees at the URJC. Nowadays both industry and academia show great interest in the Software Engineering discipline. Therefore, it is a challenge for universities to provide students with appropriate training in this area, preparing them for their future professional practice. There are many difficulties in providing that training. The most notable are: the Software Engineering area is too broad and class hours are scarce; the discipline requires a high level of abstraction; it is difficult to reproduce real-world situations in the classroom to provide a practical learning environment; the number of students per professor is very high (at least in Spain); and companies develop software at a maturity level rarely above level 2 of the CMM for Software (again, at least in Spain), as opposed to what is taught at the university. Besides, there are different degree levels and study plans, making it more difficult to structure the contents to teach in each term and degree. In this paper we present a plan for teaching Software Engineering that tries to overcome some of the difficulties above.