824 results for Computer-based assessment


Relevance:

100.00%

Publisher:

Abstract:

Aim. The purpose of this study was to develop and evaluate a computer-based dietary and physical activity self-management program for people recently diagnosed with type 2 diabetes.

Methods. The computer-based program was developed in conjunction with the target group and evaluated in a 12-week randomised controlled trial (RCT). Participants were randomised to the intervention (computer program) or control group (usual care). Primary outcomes were diabetes knowledge and goal setting (ADKnowl questionnaire, Diabetes Obstacles Questionnaire (DOQ)), measured at baseline and week 12. User feedback on the program was obtained via a questionnaire and focus groups.

Results. Seventy participants completed the 12-week RCT (32 intervention, 38 control; mean age 59 (SD) years). After completion there was a significant between-group difference in the “knowledge and beliefs” scale of the DOQ. Two-thirds of the intervention group rated the program as good or very good, 92% would recommend the program to others, and 96% agreed that the information within the program was clear and easy to understand.

Conclusions. The computer program resulted in a small but statistically significant improvement in diet-related knowledge, and user satisfaction was high. With some further development, this computer-based educational tool may be a useful adjunct to diabetes self-management.
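
The abstract reports a between-group difference but not the analysis itself. As a minimal sketch, assuming invented score data and a simple comparison of baseline-to-week-12 change scores with SciPy's ttest_ind (the trial's actual statistical method is not stated), the test could look like this:

```python
# Illustrative sketch (not the study's actual analysis code): comparing
# change in a questionnaire score between two independent groups, as in
# a parallel-group RCT with baseline and week-12 measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical DOQ "knowledge and beliefs" scores; real data would come
# from the trial's case report forms.
intervention_baseline = rng.normal(30, 5, 32)
intervention_week12 = intervention_baseline + rng.normal(3, 4, 32)
control_baseline = rng.normal(30, 5, 38)
control_week12 = control_baseline + rng.normal(0.5, 4, 38)

# Between-group comparison of change scores (week 12 minus baseline).
change_int = intervention_week12 - intervention_baseline
change_ctl = control_week12 - control_baseline
t, p = stats.ttest_ind(change_int, change_ctl)
print(f"mean change: intervention {change_int.mean():.1f}, "
      f"control {change_ctl.mean():.1f}, t = {t:.2f}, p = {p:.3f}")
```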

Relevance:

100.00%

Publisher:

Abstract:

The development of new learning models has been of great importance in recent years, with a focus on advances in the area of deep learning. Deep learning first came to prominence in 2006 and has since become a major area of research in a number of disciplines. This paper delves into the area of deep learning to present its current limitations and to propose a new idea for a fully integrated deep and dynamic probabilistic system. The new model will be applicable to a wide range of areas, focusing initially on medical image analysis, with the overall goal of using this approach for prediction purposes in computer-based medical systems.
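
The paper outlines the integrated deep and dynamic probabilistic system only at a conceptual level. As one hypothetical instantiation of coupling a deep network with probabilistic predictions, the sketch below uses Monte Carlo dropout in PyTorch; the architecture, layer sizes, and names are illustrative assumptions, not the paper's model.

```python
# One possible instantiation of a "deep + probabilistic" model, sketched
# with Monte Carlo dropout; the paper itself only outlines the idea, so
# the architecture and all names here are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MCDropoutCNN(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, 3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, 3, padding=1)
        self.fc = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)
        x = torch.flatten(x, 1)
        # Dropout stays active at prediction time to sample from an
        # approximate posterior over the network's weights.
        x = F.dropout(x, p=0.5, training=True)
        return self.fc(x)

model = MCDropoutCNN()
scan = torch.randn(1, 1, 64, 64)           # stand-in for a medical image
samples = torch.stack([F.softmax(model(scan), dim=1) for _ in range(50)])
print("predictive mean:", samples.mean(0))
print("predictive std: ", samples.std(0))  # uncertainty estimate
```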

Relevance:

100.00%

Publisher:

Abstract:

Computer-based simulation games (CSG) are a form of innovation in learning and teaching. CSG are used increasingly pervasively in various ways, such as in class activities (formative exercises) and as part of summative assessments (Leemkuil and De Jong, 2012; Zantow et al., 2005). This study investigates the current and potential use of CSG in Worcester Business School's (WBS) Business Management undergraduate programmes. An initial survey of off-the-shelf simulations reveals that there are various categories of simulation, each offering varying levels of complexity and learning opportunities depending on the field of study. The findings suggest that whilst there is only marginal adoption of CSG in learning and teaching, there is significant opportunity to increase the use of CSG to enhance learning and learner achievement, especially in Level 5 modules. The use of CSG is situational and its adoption should be undertaken on a case-by-case basis. WBS can play a major role by creating an environment that encourages and supports the use of CSG as well as other forms of innovative learning and teaching methods. The key recommendation therefore involves providing module teams with further support in embedding and integrating CSG into their modules.

Relevance:

100.00%

Publisher:

Abstract:

The education of the radiography profession is based within higher education establishments, yet a critical part of all radiography programmes is the clinical component, where students learn the practical skills of the profession. Assessments therefore not only have to assess a student's knowledge but also their clinical competence and core skills, in line with both Health and Care Professions Council and Society and College of Radiographers requirements. This timely thesis examines the possibility of using the Virtual Environment for RadioTherapy (VERT) as an assessment tool to evaluate a student's competence, giving the advantage of a standardised assessment and relieving time pressures in the clinical department. A mixed-methods approach was taken, which can be described as a quantitative-qualitative design with the emphasis on the quantitative element; a so-called QUAN→qual design. The quantitative evaluation compared two simulations, one in the virtual reality environment and another in the department using a real treatment machine. Students were asked to perform two electron setups in each simulation; the order was randomly decided, so the study can be described as a randomised cross-over design. Following this, qualitative data were collected in student focus groups to explore student perspectives in more depth. Findings indicated that performance between the two simulators was significantly different, p < 0.001, with the virtual simulation scoring significantly lower than the hospital-based simulation overall and in virtually all parameters being assessed. Thematic analysis of the qualitative data supported this finding and identified four main themes: equipment use, a lack of reality, learning opportunities, and assessment of competence. A further sub-theme identified under reality was that of the environment and senses.
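
The thesis does not publish its analysis code. As a minimal sketch of how paired scores from a randomised cross-over design might be compared, assuming invented data and a Wilcoxon signed-rank test:

```python
# Illustrative sketch (not the thesis's analysis code): a paired comparison
# of assessment scores from a randomised cross-over design, where each
# student is scored in both the virtual (VERT) and hospital simulations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_students = 30                                  # hypothetical cohort size
hospital = rng.normal(75, 8, n_students)         # scores on the real machine
vert = hospital - rng.normal(10, 5, n_students)  # lower scores in VR

# Wilcoxon signed-rank test on the paired differences; a paired t-test
# would be the parametric alternative if the differences look normal.
stat, p = stats.wilcoxon(vert, hospital)
print(f"median difference {np.median(vert - hospital):.1f}, p = {p:.4f}")
```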

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION: Differentiation between normal solid (non-cystic) pineal glands and pineal pathologies on brain MRI is difficult. The aim of this study was to assess the size of the solid pineal gland in children (0-5 years) and compare the findings with published pineoblastoma cases. METHODS: We retrospectively analyzed the size (width, height, planimetric area) of solid pineal glands in 184 non-retinoblastoma patients (73 female, 111 male) aged 0-5 years on MRI. The effect of age and gender on gland size was evaluated. Linear regression analysis was performed to analyze the relation between size and age. Ninety-nine percent prediction intervals around the mean were constructed to give a normal size range per age, with the upper bound of the prediction interval taken as the cutoff for normality. RESULTS: There was no significant interaction of gender and age for any of the three pineal gland parameters (width, height, and area). Linear regression analysis gave 99% upper prediction bounds of 7.9 mm, 4.8 mm, and 25.4 mm² for width, height, and area, respectively. The corresponding slopes (size increase per month) were 0.046, 0.023, and 0.202, respectively. Ninety-three percent (95% CI 66-100%) of asymptomatic solid pineoblastomas were larger than the 99% upper bound. CONCLUSION: This study establishes norms for solid pineal gland size in non-retinoblastoma children aged 0-5 years. Knowledge of the size of the normal pineal gland is helpful for the detection of pineal gland abnormalities, particularly pineoblastoma.
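
The regression-with-prediction-interval procedure described above can be illustrated with a short sketch. All data below are simulated and the code is not the study's own; it applies the standard formula for a 99% upper prediction bound from simple linear regression:

```python
# Sketch of the statistical procedure described above, with made-up data:
# simple linear regression of gland size on age, plus the upper bound of a
# 99% prediction interval for a new observation at a given age.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
age = rng.uniform(0, 60, 184)                        # age in months, 0-5 years
width = 5.0 + 0.046 * age + rng.normal(0, 0.8, 184)  # hypothetical widths, mm

n = len(age)
slope, intercept, r, p, se = stats.linregress(age, width)
resid = width - (intercept + slope * age)
s = np.sqrt(np.sum(resid**2) / (n - 2))              # residual standard error
t99 = stats.t.ppf(0.995, df=n - 2)                   # two-sided 99% quantile

def upper_prediction_bound(x0: float) -> float:
    """99% upper prediction bound for a single new gland at age x0."""
    sxx = np.sum((age - age.mean()) ** 2)
    half_width = t99 * s * np.sqrt(1 + 1 / n + (x0 - age.mean()) ** 2 / sxx)
    return intercept + slope * x0 + half_width

print(f"upper bound at 24 months: {upper_prediction_bound(24.0):.1f} mm")
```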

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Licentiate in Spanish, English and French). Universidad de La Salle, Facultad de Ciencias de la Educación, Licenciatura en Lengua Castellana, Inglés y Francés, 2014.

Relevance:

100.00%

Publisher:

Abstract:

The main objectives of this thesis are: to validate an improved principal components analysis (IPCA) algorithm on images; to design and simulate a digital model for image compression, face recognition, and image detection using a principal components analysis (PCA) algorithm and the IPCA algorithm; to design and simulate an optical model for face recognition and object detection using the joint transform correlator (JTC); to establish detection and recognition thresholds for each model; to compare the performance of the PCA algorithm with that of the IPCA algorithm in compression, recognition, and detection; and to compare the performance of the digital model with that of the optical model in recognition and detection. The MATLAB software was used to simulate the models. PCA is a technique for identifying patterns in data and representing the data so as to highlight their similarities and differences. Identifying patterns in high-dimensional data (more than three dimensions) is difficult because graphical representation of such data is impossible; PCA is therefore a powerful method for analyzing data. IPCA is another statistical tool for identifying patterns in data; it uses information theory to improve on PCA. The JTC is an optical correlator used to synthesize a frequency-plane filter for coherent optical systems. In general, the IPCA algorithm behaves better than the PCA algorithm in most applications. It is better than the PCA algorithm in image compression because it obtains higher compression, more accurate reconstruction, and faster processing speed with acceptable errors; in addition, it is better than the PCA algorithm in real-time image detection because it achieves the smallest error rate as well as remarkable speed. On the other hand, the PCA algorithm performs better than the IPCA algorithm in face recognition because it offers an acceptable error rate, easy calculation, and reasonable speed. Finally, in detection and recognition, the performance of the digital model is better than that of the optical model.
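
The thesis's IPCA algorithm is not reproduced in the abstract; as a baseline, the following sketch shows plain PCA image compression via the SVD (NumPy, with a random stand-in image):

```python
# A minimal sketch of PCA-based image compression (the thesis's IPCA variant
# is not reproduced here; this shows only the baseline PCA idea).
import numpy as np

def pca_compress(image: np.ndarray, k: int):
    """Project an image's rows onto its top-k principal components."""
    mean = image.mean(axis=0)
    centered = image - mean
    # Principal components via SVD of the centered data matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:k]                     # top-k principal directions
    scores = centered @ components.T        # compressed representation
    return mean, components, scores

def pca_reconstruct(mean, components, scores):
    return scores @ components + mean

image = np.random.rand(128, 128)            # stand-in for a grayscale image
mean, comps, scores = pca_compress(image, k=20)
recon = pca_reconstruct(mean, comps, scores)
ratio = image.size / (mean.size + comps.size + scores.size)
print(f"compression ratio ~{ratio:.1f}x, "
      f"RMSE {np.sqrt(np.mean((image - recon) ** 2)):.4f}")
```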

Relevance:

100.00%

Publisher:

Abstract:

Sustainability and responsible environmental behaviour constitute a vital premise for the development of humankind. In fact, over recent decades the global energy scenario has been evolving towards a scheme in which Renewable Energy Sources (RES) such as photovoltaics, wind, biomass, and hydrogen are increasingly relevant. Furthermore, hydrogen is an energy carrier that provides a means of long-term energy storage. The integration of hydrogen with local RES contributes to distributed power generation and to the early introduction of the hydrogen economy. The intermittent nature of many RES, for instance solar and wind, imposes the development of a management and control strategy to overcome this drawback. This strategy is responsible for providing reliable, stable, and efficient operation of the system, and its implementation requires a monitoring system. The present paper aims to contribute to the experimental validation of LabVIEW as a valuable tool for developing monitoring platforms for RES-based facilities. To this end, a set of real systems that have been successfully monitored is presented.
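
The monitoring platforms described in the paper are built in LabVIEW, which is graphical; the Python sketch below only illustrates the generic pattern such a platform implements (periodic acquisition, logging, and simple supervisory checks). read_sensors() is a hypothetical stand-in for the real data-acquisition drivers.

```python
# Generic monitoring-loop pattern, sketched in Python for illustration only;
# the paper's actual platforms are LabVIEW applications.
import csv
import random
import time

def read_sensors() -> dict:
    # Placeholder: in a real facility these values would come from DAQ
    # hardware (PV power, wind power, electrolyser H2 flow, etc.).
    return {"pv_power_kw": random.uniform(0, 5),
            "h2_storage_pct": random.uniform(20, 95)}

def monitor(samples: int = 5, period_s: float = 1.0,
            logfile: str = "res_log.csv") -> None:
    with open(logfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "pv_power_kw", "h2_storage_pct"])
        for _ in range(samples):
            data = read_sensors()
            writer.writerow([time.time(), data["pv_power_kw"],
                             data["h2_storage_pct"]])
            if data["h2_storage_pct"] < 25:          # supervisory rule
                print("warning: hydrogen storage low")
            time.sleep(period_s)

monitor()
```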

Relevance:

100.00%

Publisher:

Abstract:

The evaluation of school students has always attempted to estimate students' effort, abilities, and learning. At first, however, evaluation consisted of measuring the progress of a student's behaviour against a desired behaviour. Later, changes in evaluation benefited processes aimed at assessing students' academic achievement and learning itself. Today the demands of contemporary society are vast and numerous: students not only require knowledge, they need to develop skills, values, and attitudes. Postmodern education requires individuals to develop different talents and competencies in order to grow in every way. Evaluation should therefore respond to these needs by promoting an ethical, technical, and reliable assessment of students' competencies, thus providing fairer and more objective qualitative and quantitative judgments. This dissertation project is the result of a literature review of several authors and of the daily work of teachers in the Centros de Educación Media Superior a Distancia [High School Distance Centers] of Morelos, Mexico.

Relevance:

100.00%

Publisher:

Abstract:

The design optimization of industrial products has always been an essential activity for improving product quality while reducing time-to-market and production costs. Although cost management is very complex and spans all phases of the product life cycle, the control of geometrical and dimensional variations, known as Dimensional Management (DM), enables compliance with product and process requirements. Hence, tolerance-cost optimization becomes the main practice for an effective application of Design for Tolerancing (DfT) and Design to Cost (DtC) approaches, by establishing a connection between product tolerances and the associated manufacturing costs. However, despite the growing interest in this topic, the profitable industrial application of these techniques is hampered by their complexity: the definition of a systematic framework is the key element for improving design optimization, enhancing the concurrent use of Computer-Aided tools and Model-Based Definition (MBD) practices. The present doctoral research aims to define and develop an integrated methodology for product/process design optimization that better exploits the new capabilities of advanced simulations and tools. By implementing predictive models and multi-disciplinary optimization, a Computer-Aided Integrated framework for tolerance-cost optimization is proposed that allows the integration of DfT and DtC approaches and their direct application to the design of automotive components. Several case studies have been considered, with a final application of the integrated framework to a high-performance V12 engine assembly, achieving both functional targets and cost reduction. From a scientific point of view, the proposed methodology improves the tolerance-cost optimization of industrial components. The integration of theoretical approaches and Computer-Aided tools makes it possible to analyse the influence of tolerances on both product performance and manufacturing costs. The case studies proved the methodology suitable for application in the industrial field and identified further areas for improvement and refinement.
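
The thesis's framework and cost models are not given in the abstract; the sketch below illustrates the underlying idea of tolerance-cost optimization with an invented reciprocal cost model and a root-sum-square stack-up constraint:

```python
# A minimal tolerance-cost allocation sketch in the spirit of the approach
# described above (the thesis's actual framework and cost models are not
# reproduced; all coefficients are invented for illustration).
import numpy as np
from scipy.optimize import minimize

# Reciprocal cost model per feature: cost_i(t) = a_i + b_i / t
a = np.array([1.0, 1.5, 0.8])        # fixed cost terms (hypothetical)
b = np.array([0.20, 0.35, 0.15])     # tightness penalties (hypothetical)
T_ASM = 0.30                         # required assembly tolerance, mm

def total_cost(t):
    return np.sum(a + b / t)

# Root-sum-square stack-up: the allocated tolerances must combine to meet
# the assembly requirement.
constraints = [{"type": "ineq",
                "fun": lambda t: T_ASM - np.sqrt(np.sum(t ** 2))}]
bounds = [(0.01, 0.25)] * 3          # manufacturable range per feature

result = minimize(total_cost, x0=np.full(3, 0.1), bounds=bounds,
                  constraints=constraints, method="SLSQP")
print("allocated tolerances (mm):", np.round(result.x, 3))
print("minimum total cost:", round(result.fun, 2))
```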

Relevance:

100.00%

Publisher:

Abstract:

The research project aims to improve the Design for Additive Manufacturing of metal components. First, the scenario of Additive Manufacturing is depicted, describing its role in Industry 4.0 and focusing in particular on Metal Additive Manufacturing technologies and applications in the automotive sector. Second, the state of the art in Design for Additive Manufacturing is described, contextualizing the methodologies and classifying guidelines, rules, and approaches. The key phases of product design and process design for achieving lightweight functional designs and reliable processes are examined in depth, together with the Computer-Aided Technologies that support the implementation of these approaches. On this basis, a general Design for Additive Manufacturing workflow based on product and process optimization has been systematically defined. From the analysis of the state of the art, a holistic approach was considered fundamental, and the use of integrated product-process design platforms was therefore identified as a key element for its development. Accordingly, a computer-based methodology exploiting integrated tools and numerical simulations to drive product and process optimization is proposed. CAD platform-based approaches have been validated, and the potential offered by integrated tools has been evaluated. Concerning product optimization, systematic approaches to integrating topology optimization into the design have been proposed and validated through the product optimization of an automotive case study. Concerning process optimization, process simulation techniques are used to prevent manufacturing flaws related to the high thermal gradients of metal processes, with case studies validating the results against experimental data and an application to the process optimization of an automotive case study. Finally, an example of product and process design through the proposed simulation-driven integrated approach is provided to prove the method's suitability for the effective redesign of Additive Manufacturing-based high-performance metal products. The results are then outlined and further developments are discussed.
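
As a schematic illustration of the simulation-driven integrated loop (not the thesis's actual tools), the sketch below couples stand-in product and process simulations inside a single optimizer; every model and number is invented:

```python
# Schematic sketch of a simulation-driven integrated product-process loop:
# one optimizer trades off component mass against constraints fed by a
# product (structural) and a process (thermal distortion) simulation.
# Both simulators below are crude stand-ins, not real FE/thermal solvers.
import numpy as np
from scipy.optimize import minimize

LOAD_N = 500.0
MAX_STRESS = 200.0      # MPa, hypothetical allowable
MAX_DISTORTION = 0.10   # mm, hypothetical process limit

def structural_sim(thickness_mm: float) -> float:
    """Stand-in for an FE analysis: stress falls as sections thicken."""
    return LOAD_N / (2.5 * thickness_mm)

def process_sim(thickness_mm: float) -> float:
    """Stand-in for a process simulation: thicker sections accumulate
    more residual stress and distort more during the build."""
    return 0.02 * thickness_mm ** 1.5

mass = lambda t: 7.8e-3 * 100.0 * t[0]       # proxy for component mass, kg

cons = [{"type": "ineq", "fun": lambda t: MAX_STRESS - structural_sim(t[0])},
        {"type": "ineq", "fun": lambda t: MAX_DISTORTION - process_sim(t[0])}]

res = minimize(mass, x0=[3.0], bounds=[(0.5, 10.0)],
               constraints=cons, method="SLSQP")
print(f"optimal thickness {res.x[0]:.2f} mm, mass proxy {res.fun:.3f} kg")
```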

Relevance:

100.00%

Publisher:

Abstract:

Over the years it has been observed that a large part of teachers' time is spent on assessment. For this reason, the automatic correction of free-text answers has been a subject of research for some decades. Having exercises corrected by the computer allows the teacher to devote time to tasks that improve students' learning. Furthermore, new technologies increasingly provide tools of great utility in teaching, since besides facilitating the presentation of knowledge they also allow greater retention of information. Combining classroom management tools with the automatic correction of free-text answers is therefore a very interesting challenge. The objective of this dissertation was to carry out a study of the area of computer-assisted assessment in which this work is situated. Initially, several spell checkers were analysed in order to select the one to be integrated into the proposed module. Next, the most relevant techniques and the tools best suited to the scope of this work were studied. In this context, the idea was to start from an existing classroom management tool and develop a module for the correction of exercises. The UNI_NET-Classroom application, the tool for which the module was developed, already contained an exercise management component that only performed correction of multiple-choice answers. This work aimed to add a further feature to that component, intended to support the teacher by correcting exercises and suggesting the mark to be awarded. Finally, several experiments were carried out on the developed module in order to draw conclusions for the present work. The most important conclusion was that automatic correction tools are an asset for teachers and schools.
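
The dissertation's correction module is not reproduced in the abstract; as one common baseline technique for scoring free-text answers, the sketch below compares a student answer with a teacher's reference answer by bag-of-words cosine similarity and suggests a mark (all text and marks are invented):

```python
# Baseline free-text scoring sketch: cosine similarity between bag-of-words
# vectors of the student answer and a teacher reference answer, scaled to a
# suggested mark. This is illustrative, not the UNI_NET-Classroom module.
import math
import re
from collections import Counter

def tokens(text: str) -> Counter:
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    common = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in common)
    norm = math.sqrt(sum(v * v for v in a.values()))
    norm *= math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def suggest_mark(student: str, reference: str, max_mark: float = 5.0) -> float:
    """Suggested mark proportional to similarity; a teacher reviews it."""
    return round(max_mark * cosine(tokens(student), tokens(reference)), 1)

reference = "A database index speeds up queries by avoiding full table scans."
student = "Indexes make queries faster because the table is not fully scanned."
print("suggested mark:", suggest_mark(student, reference))
```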

Relevance:

100.00%

Publisher:

Abstract:

Automated teaching and assessment through a Computer Based Assessment (CBA) system requires specialised software adapted to the type of activities to be handled and assessed. In this thesis, a CBA environment has been developed that supports the learning and assessment of the main topics of a database course. To this end, the existing tools for each of these topics (entity/relationship diagrams, class diagrams, relational database schemas, normalisation, relational algebra queries, and the SQL language) were analysed, and for each topic a module for automatic correction and assessment was analysed, designed, and implemented, offering improvements over the existing tools. These modules have been integrated into a single environment named ACME-DB.
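
ACME-DB's correction modules are not detailed in the abstract; a typical baseline check for grading SQL exercises is to run the student's query and the reference query against a small test database and compare result sets, as in this sketch (schema and queries are invented):

```python
# Baseline SQL-grading sketch: execute student and reference queries on an
# in-memory test database and compare result sets, ignoring row order.
# Illustrative only; not ACME-DB's actual correction logic.
import sqlite3

def grade_sql(student_sql: str, reference_sql: str, setup_sql: str) -> bool:
    conn = sqlite3.connect(":memory:")
    conn.executescript(setup_sql)
    try:
        student_rows = sorted(conn.execute(student_sql).fetchall())
    except sqlite3.Error:
        return False                      # syntactically invalid answer
    reference_rows = sorted(conn.execute(reference_sql).fetchall())
    return student_rows == reference_rows

SETUP = """
CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT, year INTEGER);
INSERT INTO student VALUES (1, 'Ana', 2), (2, 'Joan', 3), (3, 'Marta', 2);
"""
print(grade_sql("SELECT name FROM student WHERE year = 2",
                "SELECT name FROM student WHERE year = 2 ORDER BY name",
                SETUP))  # True: same rows, order ignored
```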