798 results for Computer-Based Training System


Relevance:

100.00%

Publisher:

Abstract:

A computer-based simulation game (CSG) was used for the first time in a final-year undergraduate module. A change management simulation game was used in the seminar classes as a formative exercise linked to parts of the students' summative assessment. The module evaluation suggests that most students learned from using the CSG.


Computer-based simulation games (CSGs) are a form of innovation in learning and teaching. CSGs are used in various ways, such as in class activities (formative exercises) and as part of summative assessments (Leemkuil and De Jong, 2012; Zantow et al., 2005). This study investigates the current and potential use of CSGs in Worcester Business School's (WBS) Business Management undergraduate programmes. An initial survey of off-the-shelf simulations reveals several categories of simulation, each offering a different level of complexity and learning opportunity depending on the field of study. The findings suggest that while adoption of CSGs in learning and teaching is marginal, there is significant opportunity to increase their use to enhance learning and learner achievement, especially in Level 5 modules. The use of CSGs is situational, and adoption should be considered on a case-by-case basis. WBS can play a major role by creating an environment that encourages and supports the use of CSGs as well as other innovative learning and teaching methods. The key recommendation is therefore to provide module teams with further support in embedding and integrating CSGs into their modules.


The usefulness of computers in schools and training has been undisputed for some years. There is, however, current disagreement over which tasks computers can take on independently. When the assumption of teaching functions by computer-based tutoring systems is assessed, shortcomings frequently have to be noted. Starting from current practical implementations of computer-based tutoring systems, the aim of this work is to identify distinct classes of central teaching competencies (student modelling, subject knowledge, and instructional activities in the narrower sense). Within each class, the global capabilities of the tutoring systems and the necessary, complementary activities of human tutors are determined. The resulting classification scheme allows both the categorisation of typical tutoring systems and the identification of specific competencies that should receive greater attention in future teacher and trainer education. (DIPF/Orig.)


Supplementary data associated with this article are available in the online version at: http://dx.doi.org/10.1016/j.marpol.2016.06.021


The main objectives of this thesis are: to validate an improved principal components analysis (IPCA) algorithm on images; to design and simulate a digital model for image compression, face recognition, and image detection using the principal components analysis (PCA) and IPCA algorithms; to design and simulate an optical model for face recognition and object detection using the joint transform correlator (JTC); to establish detection and recognition thresholds for each model; to compare the performance of the PCA and IPCA algorithms in compression, recognition, and detection; and to compare the performance of the digital and optical models in recognition and detection. MATLAB® software was used to simulate the models. PCA is a technique for identifying patterns in data and representing the data so as to highlight similarities and differences. Identifying patterns in high-dimensional data (more than three dimensions) is difficult because such data cannot be represented graphically, which makes PCA a powerful method of analysis. IPCA is another statistical tool for identifying patterns in data; it uses information theory to improve on PCA. The JTC is an optical correlator used to synthesise a frequency-plane filter for coherent optical systems. In most applications, the IPCA algorithm behaves better than the PCA algorithm. It outperforms PCA in image compression, obtaining higher compression, more accurate reconstruction, and faster processing with acceptable error; it is also better in real-time image detection, achieving the smallest error rate together with remarkable speed. On the other hand, the PCA algorithm performs better than the IPCA algorithm in face recognition, offering an acceptable error rate, easy calculation, and reasonable speed. Finally, in detection and recognition, the digital model outperforms the optical model.
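The PCA compression step can be sketched with a standard SVD-based implementation. This is a generic Python illustration, not the thesis's MATLAB models and not the IPCA variant:

```python
import numpy as np

def pca_compress(image, k):
    """Reconstruct a 2-D image from its first k principal components."""
    # Centre the data (each row treated as an observation)
    mean = image.mean(axis=0)
    centered = image - mean
    # SVD of the centred data yields the principal directions
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    # Project onto the top-k components and reconstruct
    scores = U[:, :k] * S[:k]
    return scores @ Vt[:k] + mean

# A rank-1 "image" is reconstructed exactly from a single component
img = np.outer(np.arange(8.0), np.ones(8))
rec = pca_compress(img, 1)
```

Keeping k components stores only the scores and k basis vectors instead of the full image, which is the source of the compression.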


Although DMTA is nowadays one of the most widely used techniques for characterising the thermo-mechanical behaviour of polymers, it is only effective for small-amplitude oscillatory tests and is limited to single-frequency analysis (the linear regime). In this thesis work, a Fourier-transform-based experimental system has proven to give insight into structural and chemical changes in specimens during large-amplitude oscillatory tests by exploiting multi-frequency spectral analysis, turning out to be a more sensitive tool than the classical linear approach. The test campaign focused on three test typologies: strain sweep tests, damage investigation, and temperature sweep tests.
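The multi-frequency idea can be illustrated with a minimal sketch: a nonlinear response to a sinusoidal excitation contains odd higher harmonics, and their intensity relative to the fundamental (e.g. I3/I1) is a sensitivity measure that a single-frequency linear analysis cannot capture. The signal below is simulated for illustration; it is not data from the test campaign:

```python
import numpy as np

fs = 1000              # sampling rate (Hz)
f0 = 1.0               # excitation frequency (Hz)
t = np.arange(0, 10, 1 / fs)

# Simulated stress response: fundamental plus a small third harmonic,
# the signature of a nonlinear (large-amplitude) regime
stress = np.sin(2 * np.pi * f0 * t) + 0.05 * np.sin(2 * np.pi * 3 * f0 * t)

spectrum = np.abs(np.fft.rfft(stress))
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# Third-harmonic-to-fundamental intensity ratio (I3/I1)
i1 = spectrum[np.argmin(np.abs(freqs - f0))]
i3 = spectrum[np.argmin(np.abs(freqs - 3 * f0))]
ratio = i3 / i1
```

In the linear regime the ratio stays near zero; its growth during a strain sweep flags the onset of nonlinearity.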


In multi-unit organisations, such as a bank and its branches or a national body delivering publicly funded health or education services through local operating units, the need arises to incentivize the units to operate efficiently. In such instances, it is generally accepted that units found to be inefficient can be encouraged to make efficiency savings. Units found to be efficient, however, need to be incentivized in a different manner. It has been suggested that efficient units could be rewarded in proportion to the degree to which their attainment exceeds that of the best of the rest, normally referred to as "super-efficiency". A recent approach to this issue (Varmaz et al., 2013) used Data Envelopment Analysis (DEA) models to measure the super-efficiency of the whole system of operating units with and without the involvement of each unit in turn in order to provide incentives. We identify shortcomings in this approach and use it as a starting point to develop a new DEA-based system for incentivizing operating units to operate efficiently for the benefit of the aggregate system of units. Data from a small German retail bank is used to illustrate our method.
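For readers unfamiliar with DEA, the standard input-oriented CCR multiplier model (not the super-efficiency variant of Varmaz et al. or the method of this paper) can be sketched as a linear programme. The two-unit data below are invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.
    X: (n_units, n_inputs), Y: (n_units, n_outputs)."""
    n, m = X.shape
    _, s = Y.shape
    # Decision variables: output weights u (s of them), then input weights v (m)
    c = np.concatenate([-Y[o], np.zeros(m)])          # maximise u . y_o
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
    b_eq = [1.0]                                      # normalisation v . x_o = 1
    A_ub = np.hstack([Y, -X])                         # u . y_j - v . x_j <= 0, all j
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Two units, one input, one output: unit 0 produces 2 per input unit,
# unit 1 only 1, so unit 1 is 50% efficient relative to unit 0.
X = np.array([[1.0], [1.0]])
Y = np.array([[2.0], [1.0]])
eff0 = dea_efficiency(X, Y, 0)
eff1 = dea_efficiency(X, Y, 1)
```

A super-efficiency variant would drop unit o's own constraint from `A_ub`, allowing efficient units to score above 1.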


A recommender system is a type of intelligent system that exploits historical user ratings on items and/or auxiliary information to recommend items to users. Such systems play a critical role in a wide range of online shopping, e-commerce, and social networking applications. Collaborative filtering (CF) is the most popular approach to recommender systems, but it suffers from the complete cold start (CCS) problem, where no rating records are available, and the incomplete cold start (ICS) problem, where only a small number of rating records are available, for some new items or users in the system. In this paper, we propose two recommendation models that solve the CCS and ICS problems for new items, based on a framework tightly coupling a CF approach with a deep learning neural network. A specific deep neural network, SADE, is used to extract the content features of the items. The state-of-the-art CF model timeSVD++, which models and exploits the temporal dynamics of user preferences and item features, is modified to take the content features into account when predicting ratings for cold start items. Extensive experiments on a large Netflix movie-rating dataset show that our proposed recommendation models largely outperform the baseline models for rating prediction on cold start items. The two proposed models are also evaluated and compared on ICS items, and a flexible scheme of model retraining and switching is proposed to handle the transition of items from cold start to non-cold start status. The experimental results on Netflix movie recommendation show that tight coupling of a CF approach with a deep learning neural network is feasible and very effective for cold start item recommendation. The design is general and can be applied to many other recommender systems for online shopping and social networking applications. Solving the cold start item problem can greatly improve user experience of and trust in recommender systems, and effectively promote cold start items.
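The core idea of coupling content features with CF latent factors can be sketched in a much simplified form: learn a map from content features to item factors on warm items, then estimate a cold item's factors from its content alone. The toy data and the linear map below are illustrative assumptions, not the paper's SADE/timeSVD++ architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: latent factors for users and warm items, plus content features
# for every item (in the paper these come from a deep network such as SADE)
n_users, n_warm, k, f = 50, 40, 4, 6
P = rng.normal(size=(n_users, k))        # user factors (from CF training)
Q_warm = rng.normal(size=(n_warm, k))    # warm-item factors (from CF training)
W_true = rng.normal(size=(f, k))
C_warm = Q_warm @ np.linalg.pinv(W_true) # content features consistent with factors

# Learn a linear map from content features to latent factors on warm items
W, *_ = np.linalg.lstsq(C_warm, Q_warm, rcond=None)

# A cold item has no ratings, only content: estimate its factors from
# content and score it against every user
c_cold = C_warm[0]                       # pretend item 0 just entered the system
q_cold = c_cold @ W
scores = P @ q_cold                      # predicted ratings for all users
```

Once the item accumulates real ratings, its content-derived factors can be replaced by CF-trained ones, which mirrors the retraining-and-switching scheme described above.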


Marine protected areas (MPAs) are a global conservation and management tool for enhancing the resilience of linked social-ecological systems, with the aim of conserving biodiversity and providing ecosystem services for sustainable use. However, the MPAs implemented worldwide span a large variety of zoning and management schemes, from single to multiple zoning and from no-take to multiple-use areas. The current IUCN categorisation of MPAs is based on management objectives, which often mismatch the actual regulations, creating strong uncertainty when evaluating the effectiveness of MPAs globally. A novel global classification system for MPAs based on regulations of uses, as an alternative to or complement of the current IUCN system of categories, is presented. Scores for uses, weighted by their potential impact on biodiversity, were constructed; each zone within an MPA was scored, and an MPA index integrates the zone scores. This system classifies MPAs as well as each MPA zone individually, is globally applicable, and unambiguously discriminates the impacts of uses. (C) 2016 The Authors. Published by Elsevier Ltd.
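A regulation-based scoring of this kind might be sketched as follows. The uses, impact weights, and aggregation rule below are invented for illustration only; they are not the published index:

```python
# Hypothetical impact weights per allowed use (higher = more impact)
IMPACT = {"industrial_fishing": 0.9, "recreational_fishing": 0.5,
          "anchoring": 0.3, "diving": 0.1}

def zone_score(allowed_uses):
    """1.0 for a no-take zone, lower as higher-impact uses are permitted."""
    impact = sum(IMPACT[u] for u in allowed_uses)
    return 1.0 - impact / sum(IMPACT.values())

def mpa_index(zones):
    """Area-weighted aggregate of zone scores; zones: (area_km2, uses) pairs."""
    total_area = sum(area for area, _ in zones)
    return sum(area * zone_score(uses) for area, uses in zones) / total_area

no_take = (10.0, [])                                   # nothing allowed
multi_use = (30.0, ["recreational_fishing", "diving"])
index = mpa_index([no_take, multi_use])
```

Scoring each zone separately and then aggregating is what lets the scheme classify both individual zones and whole multiple-zone MPAs.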


The design optimization of industrial products has always been essential to improving product quality while reducing time-to-market and production costs. Although cost management is very complex and spans all phases of the product life cycle, the control of geometrical and dimensional variations, known as Dimensional Management (DM), allows compliance with product and process requirements. Tolerance-cost optimization thus becomes the main practice for effectively applying Design for Tolerancing (DfT) and Design to Cost (DtC) approaches, by connecting product tolerances to the associated manufacturing costs. However, despite growing interest in this topic, profitable industrial application of these techniques is hampered by their complexity: the definition of a systematic framework is the key to improving design optimization and enhancing the concurrent use of Computer-Aided tools and Model-Based Definition (MBD) practices. This doctoral research aims to define and develop an integrated methodology for product/process design optimization that better exploits the new capabilities of advanced simulations and tools. By implementing predictive models and multi-disciplinary optimization, a Computer-Aided integrated framework for tolerance-cost optimization is proposed that integrates the DfT and DtC approaches and applies them directly to the design of automotive components. Several case studies were considered, with a final application of the integrated framework to a high-performance V12 engine assembly to achieve both functional targets and cost reduction. From a scientific point of view, the proposed methodology improves the tolerance-cost optimization of industrial components: the integration of theoretical approaches and Computer-Aided tools makes it possible to analyse the influence of tolerances on both product performance and manufacturing costs. The case studies proved the methodology suitable for application in the industrial field and identified further areas for improvement and refinement.
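As a minimal illustration of what tolerance-cost optimization means, consider a textbook reciprocal cost model c_i(t_i) = a_i + b_i / t_i with a worst-case stack-up budget sum(t_i) <= T: the Lagrange conditions give the closed-form allocation t_i = T * sqrt(b_i) / sum_j sqrt(b_j). The cost model and coefficients are assumptions for illustration, not the framework developed in the thesis:

```python
import numpy as np

def allocate_tolerances(b, T):
    """Cost-optimal tolerances under c_i = a_i + b_i / t_i and sum(t_i) = T.

    b: cost sensitivity of each dimension (cost of tightening it);
    T: total worst-case stack-up budget.
    """
    root = np.sqrt(np.asarray(b, dtype=float))
    # Lagrange optimum: budget shared in proportion to sqrt(b_i)
    return T * root / root.sum()

b = [4.0, 1.0, 1.0]            # the first feature is costliest to tighten
t = allocate_tolerances(b, T=0.8)
```

The feature that is most expensive to tighten receives the largest share of the budget, which is the basic trade-off a tolerance-cost framework automates at industrial scale.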


The research project aims to improve the Design for Additive Manufacturing of metal components. First, the Additive Manufacturing scenario is depicted, describing its role in Industry 4.0 and focusing in particular on Metal Additive Manufacturing technologies and applications in the Automotive sector. Second, the state of the art in Design for Additive Manufacturing is described, contextualising the methodologies and classifying guidelines, rules, and approaches. The key phases of product design and process design for achieving lightweight functional designs and reliable processes are examined in depth, together with the Computer-Aided Technologies that support their implementation. A general Design for Additive Manufacturing workflow based on product and process optimization has therefore been systematically defined. The analysis of the state of the art showed a holistic approach to be fundamental, so the use of integrated product-process design platforms was evaluated as a key element of its development. Indeed, a computer-based methodology exploiting integrated tools and numerical simulations to drive product and process optimization is proposed. CAD platform-based approaches were validated, and the potential offered by integrated tools was evaluated. Concerning product optimization, systematic approaches for integrating topology optimization into the design were proposed and validated through the product optimization of an automotive case study. Concerning process optimization, process simulation techniques to prevent manufacturing flaws related to the high thermal gradients of metal processes were developed, with case studies validating the results against experimental data and an application to the process optimization of an automotive case study. Finally, an example of product and process design through the proposed simulation-driven integrated approach is provided to prove the method's suitability for effective redesigns of Additive Manufacturing-based high-performance metal products. The results are then outlined, and further developments are discussed.


Some games aim at competition, others at learning; some are played in groups, others individually. They all share one common factor, however: the experience taken from the moment is unique. Whether that experience is positive or negative, it serves as learning, even if only of the rules and mechanics of the device. Serious Games simulate real-world situations or processes and are designed to solve a problem. They often sacrifice fun and entertainment in order to achieve a desired kind of progress for the player. As in the past, and given the exponential development of technology, Serious Games can now play a fundamental role in the development of new therapies and health tools. It is precisely by looking at the present, with an eye on the future of Serious Games applied to health, that this research was developed. As a complement, the Typlife project is also presented. Aimed at young people with diabetes, it is an academic project whose goal is the development of a smartphone application for diabetes control that engages the user in an interactive experience of rewards for good day-to-day practices.


The project described herein has led to a convenient, computer-based expert system for identifying and evaluating potentially effective erosion- and sedimentation-control measures for roadway construction throughout Iowa and elsewhere in the Midwest. The expert system is intended to be an accessible, efficient, and practical resource to aid state, county, and municipal engineers in selecting the best management practices for preventing unwanted erosion and sedimentation at roadway construction sites, both during and after construction.
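The kind of rule base such an expert system might encode can be sketched as condition-to-practice rules evaluated against the conditions at a site. The rules, thresholds, and practice names below are invented for illustration and are not those of the actual system:

```python
# Hypothetical rule base: each rule maps a site condition to a candidate
# erosion/sedimentation-control practice
RULES = [
    (lambda s: s["slope_pct"] > 15, "erosion-control blanket"),
    (lambda s: s["slope_pct"] <= 15, "temporary seeding and mulch"),
    (lambda s: s["drains_to_stream"], "silt fence along the downslope edge"),
    (lambda s: s["disturbed_acres"] > 1.0, "sediment basin"),
]

def recommend(site):
    """Return every practice whose rule fires for the given site."""
    return [practice for condition, practice in RULES if condition(site)]

site = {"slope_pct": 20, "drains_to_stream": True, "disturbed_acres": 2.5}
measures = recommend(site)
```

A production system would add rule priorities, cost data, and explanations of why each practice was suggested, but the fire-all-matching-rules loop above is the essential inference step.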