825 results for digital learning tools
Abstract:
The "b-learning" methodology is a new teaching scenario that requires the creation, adaptation, and application of new learning tools aimed at the assimilation of new collaborative competences. Three well-known concepts are relevant in this context: knowledge spirals, situational leadership, and informal learning. Knowledge spirals are a basic concept of knowledge management, built on the idea that knowledge grows as a four-phase cycle is repeated: 1) knowledge is created (for instance, an idea is conceived); 2) the knowledge is encoded into a format in which it can be easily transmitted; 3) the knowledge is adapted so that it is easily understood, and it is put to use; 4) new knowledge is created, improving the previous knowledge (step 1). Each cycle corresponds to a step of a spiral staircase: going up the staircase creates more knowledge. Situational leadership, in turn, is based on the idea that each person has a maturity level for a specific task, and that this maturity grows with experience. The teacher (leader) therefore has to adapt the teaching style to the student's (subordinate's) needs; in this way the student's professional and personal development accelerates, improving results and satisfaction. This educational strategy, combined with informal learning, in particular the zone of proximal development, and supported by our University's own learning content management system, produces a successful and well-evaluated learning activity in Master's subjects centred on the collaborative preparation and oral presentation of short, specific topics related to those subjects. The teacher takes on a consultant role for the selected topic, guiding and supervising the work and often incorporating work done in previous courses, much like a research tutor or a more experienced student. In this work, we present the academic results, the degree of interactivity achieved in these collaborative tasks, statistics, and the degree of satisfaction reported by our postgraduate students.
Abstract:
Mental simulations and analogies have been identified as powerful learning tools for really new products (RNPs). Furthermore, visuals in advertising have recently been conceptualized as meaningful sources of information rather than peripheral cues, and may thus help consumers learn about RNPs. The study of visual attention may also contribute to understanding the links between conceptual and perceptual analyses when learning about an RNP. Two conceptual models are developed. The first model consists of causal relationships between the attributes of advertising stimuli for RNPs and consumer responses, together with mediating influences. The second model focuses on the role of visual attention in product comprehension as a response to advertising stimuli. Two experiments are conducted: a Web experiment and an eye-tracking experiment. The first experiment (858 subjects) examines the effect of learning strategies (mental simulation vs. analogy vs. no analogy/no mental simulation) and presentation formats (words vs. pictures) on individual responses; the mediating role of emotions is assessed. The second experiment (17 subjects) investigates the effect of learning strategies and presentation formats on product comprehension, along with the role of attention. The findings from experiment 1 indicate that learning strategies and presentation formats can either enhance or undermine the effect of advertising stimuli on individual responses. Moreover, the nature of the product (i.e. hedonic vs. utilitarian vs. hybrid) should be considered when designing communications for RNPs. The mediating role of emotions is verified. Experiment 2 suggests that an increase in attention to the message may reflect either enhanced comprehension or confusion.
Abstract:
The primary aim of this dissertation is to develop data mining tools for knowledge discovery in biomedical data when multiple (homogeneous or heterogeneous) sources of data are available. The central hypothesis is that, when information from multiple data sources is used appropriately and effectively, knowledge discovery can be achieved better than is possible from a single source alone. Recent advances in high-throughput technology have enabled biomedical researchers to generate large volumes of diverse types of data on a genome-wide scale. These data include DNA sequences, gene expression measurements, and much more; they provide the motivation for building analysis tools that elucidate the modular organization of the cell. The challenges include efficiently and accurately extracting information from the multiple data sources, representing the information effectively, developing analytical tools, and interpreting the results in the context of the domain. The first part considers the application of feature-level integration to design classifiers that discriminate between soil types. The machine learning tools SVM and KNN were used to successfully distinguish between several soil samples. The second part considers clustering using multiple heterogeneous data sources. The resulting Multi-Source Clustering (MSC) algorithm was shown to perform better than clustering methods that use only a single data source or a simple feature-level integration of heterogeneous data sources. The third part proposes a new approach to effectively incorporate incomplete data into clustering analysis. Adapted from the K-means algorithm, the Generalized Constrained Clustering (GCC) algorithm makes use of incomplete data in the form of constraints to perform exploratory analysis, and novel approaches for extracting constraints are proposed. For sufficiently large constraint sets, the GCC algorithm outperformed the MSC algorithm. The last part considers the problem of providing a theme-specific environment for mining multi-source biomedical data. The resulting database, PlasmoTFBM, focuses on gene regulation of Plasmodium falciparum; it contains diverse information and offers a simple interface that allows biologists to explore the data. It provided a framework for comparing different analytical tools for predicting regulatory elements and for designing useful data mining tools. The conclusion is that the experiments reported in this dissertation strongly support the central hypothesis.
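The abstract does not reproduce the GCC algorithm itself; as a minimal sketch of the idea it describes, constraint-aware K-means lets pairwise constraints steer the assignment step. The function name, the cannot-link encoding, and the data below are illustrative assumptions in the style of COP-KMeans-like methods, not the dissertation's actual implementation.

```python
# Hypothetical sketch of constraint-aware K-means: cannot-link constraints
# (pairs of points that must not share a cluster) guide a greedy assignment.
import numpy as np

def constrained_kmeans(X, k, cannot_link, n_iter=50, seed=0):
    """cannot_link maps point index -> set of indices it must avoid;
    include both directions for symmetric constraints."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        for i, x in enumerate(X):
            # Try centers from nearest to farthest; keep the first one that
            # violates no constraint (if all are blocked, the old label stays).
            for c in np.argsort(((centers - x) ** 2).sum(axis=1)):
                if all(labels[j] != c for j in cannot_link.get(i, ())):
                    labels[i] = c
                    break
        for c in range(k):  # recompute centers from current members
            members = X[labels == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    return labels, centers

# Two nearly coincident points are forced into different clusters:
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
print(constrained_kmeans(X, k=2, cannot_link={0: {1}, 1: {0}})[0])
```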
Abstract:
The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold: cloud users can size their VMs appropriately and pay only for the resources they need, and service providers can offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients pay exactly for the performance they actually experience, while administrators can maximize total revenue by exploiting application performance models and SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Networks and Support Vector Machines, for accurately modeling the performance of virtualized applications; we also suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these tools. Third, we presented an approach to optimal VM sizing that employs the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm that maximizes the SLA-generated revenue for a data center.
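The thesis's actual models and benchmarks are not given in the abstract. As a rough illustration of the performance-modeling step it describes, the sketch below fits an SVM regressor (scikit-learn's SVR) mapping resource allocations to a performance metric; the feature set, the synthetic data, and the hyperparameters are assumptions for illustration only.

```python
# Illustrative sketch: learn application performance as a function of
# resource allocations with an SVM regressor. The features and the
# synthetic "throughput" response are placeholders, not the thesis's data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# Features per VM: CPU share (%), memory (GB), I/O bandwidth (MB/s).
X = rng.uniform([10, 1, 50], [100, 16, 500], size=(200, 3))
# Synthetic response with diminishing returns in CPU and memory.
y = 100 * np.tanh(X[:, 0] / 50) * np.tanh(X[:, 1] / 4) + rng.normal(0, 2, 200)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, y)

# Predict performance for a candidate VM size; a VM-sizing step would
# search such predictions for the cheapest allocation meeting an SLA.
print(model.predict([[50.0, 8.0, 200.0]]))
```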
Abstract:
Right across Europe, technology is playing a vital part in enhancing learning for an increasingly diverse population of learners. Learning is increasingly flexible, social, and mobile, and is supported by high-quality multimedia resources. Institutional VLEs are seeing a shift towards open-source products, and these core systems are supplemented by a range of social and collaborative learning tools based on Web 2.0 technologies. Learners undertaking field studies and those in the workplace are coming to expect that these off-campus experiences will also be technology-rich, whether supported by institutional or user-owned devices. As well as keeping European businesses competitive, learning is seen as a means of increasing social mobility and supporting an agenda of social justice. For a number of years the EUNIS E-Learning Task Force (ELTF) has conducted snapshot surveys of e-learning across member institutions, collected case studies of good practice in e-learning (see Hayes et al., 2009), supported a group looking at the future of e-learning, and showcased the best of innovation in its e-learning Award. Now, for the first time, the ELTF membership has come together to undertake an analysis of developments in the member states and to assess what these might mean for the future. The group applied the techniques of World Café conversation and Scenario Thinking to develop its thoughts. The analysis is unashamedly qualitative and draws on expertise from leading universities across eight of the EUNIS member states. What emerges is interesting in terms of the common trends in developments across all of the nations and the similarities in hopes and concerns about the future development of learning.
Abstract:
Barnsley College’s level 3 and 4 diplomas in digital learning design are delivered in one year, enabling apprentices to be employed alongside their studies in the college’s innovative learning design company, Elephant Learning Designs. The limited time this allows for delivery and assessment has prompted course leaders to rethink their approach to course structure, assessment and feedback design, and the role of technology in evidence collection.
Abstract:
Advances in, and the spread of, Information and Communication Technologies (ICT) open new perspectives for education supported by internet-based digital learning environments (Fiolhais & Trindade, 2003). The platform used in the Projeto Matemática Ensino (PmatE) of the Universidade de Aveiro (UA) is one of the software tools supporting such environments, through assessment based on the Question Generator Model (Modelo Gerador de Questões, MGQ), which makes it possible to obtain a picture of students' progress (Vieira, Carvalho & Oliveira, 2004). Recognizing the didactic importance of this tool, already demonstrated in other studies (e.g., Carvalho, 2011; Pais de Aquino, 2013; Peixoto, 2009), the general aim of the present study is to develop digital physics teaching materials on radiation and modern physics topics, within the context of the Mozambican 12th-grade physics syllabus, for students and teachers. A further aim was to propose ICT-based working strategies to improve the quality of learning in this subject. The study was based on the following three research questions: (a) How can learning assessment instruments based on the question generator model be designed for the study of radiation and modern physics content, in the context of the Mozambican 12th-grade physics syllabus? (b) What potential and what constraints do these instruments show when used with students and teachers? (c) How can the knowledge built be transferred to other physics topics and to science teaching in general? The study followed a mixed-methods Development Studies methodology comprising the phases of Analysis, Design, Development, and Evaluation, framed as an exploratory study with a case-study component. In the Analysis phase, the context of education in Mozambique was discussed, together with the challenges of addressing radiation and modern physics content in secondary education within the demanding landscape that science education currently faces. In the Design phase, ICT approaches to the teaching and learning of physics and of science in general were reviewed, and the objective tree for the aforementioned content was built. In the Development phase, the data collection instruments were constructed and the MGQ prototypes were drafted, then programmed, validated, and tested in printed form in the exploratory study. In the Evaluation phase, the main study was conducted, applying the models in digital format and evaluating them, which included questionnaire surveys of students and teachers. The results indicate that, when designing MGQs, defining the learning objectives in behavioural terms is fundamental for formulating questions and for analysing assessment results with a view to adjusting teaching strategies. They also indicate that the PmatE platform supporting the MGQs, despite constraints arising from its dependence on internet access and some didactic limitations, contributes positively to learning and to identifying students' difficulties and main errors, and, through assessment, stimulates the processes of assimilation and accommodation of knowledge. The study recommends changes in teaching and learning practices so that digital content can be used to complement the didactic treatment of subject matter.
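PmatE's Question Generator Model is not specified in the abstract. As a hypothetical sketch of the underlying idea, a single parameterized template can generate many concrete question instances, each with a machine-checkable answer, which is what lets the platform track progress over repeated attempts. The template, names, and values below are illustrative assumptions, not PmatE's actual model.

```python
# Hypothetical sketch of the MGQ idea: one parameterized template yields
# many question instances with machine-gradable answers.
import random

def half_life_question(seed=None):
    rng = random.Random(seed)
    n0 = rng.choice([200, 400, 800, 1600])   # initial sample mass (mg)
    half_life = rng.choice([2, 5, 10])       # half-life (years)
    periods = rng.randint(2, 4)              # elapsed half-lives
    text = (f"A radioactive sample of {n0} mg has a half-life of "
            f"{half_life} years. What mass remains after "
            f"{half_life * periods} years?")
    return text, n0 / 2 ** periods           # expected answer (mg)

question, answer = half_life_question(seed=42)
print(question)
print(answer)
```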
Abstract:
This article explores the lived experiences of two academics in a UK Higher Education Institution who have embedded digital learning approaches within their curriculum delivery. Achieving student excellence can be impeded by a lack of engagement and sense of identity on large courses. Digital learning strategies can offer opportunities to overcome these challenges by empowering students to engage self-confidently. Through an evaluation of the authors’ own experiences of using social media, polling and web-conferencing software, the article shows how interacting with students via a range of learning technologies can create more inclusive and engaging learning environments. Including feedback from students within this article provides evidence that diversification of communication within teaching and learning practice gives students more choice and opportunity to interact with both their peers and teaching staff. The article concludes with recommendations for embedding technology, whilst acknowledging the well-established value of face-to-face interaction.
Abstract:
Deep learning methods are extremely promising machine learning tools for analyzing neuroimaging data. However, their potential use in clinical settings is limited by the challenges of applying these methods to neuroimaging data. In this study, a type of data leakage caused by slice-level data splitting, introduced during the training and validation of a 2D CNN, is first surveyed, and a quantitative assessment of the resulting overestimation of model performance is presented. Second, an interpretable, leakage-free deep learning software package, written in Python and offering a wide range of options, was developed to conduct both classification and regression analysis. The software was applied to the study of mild cognitive impairment (MCI) in patients with small vessel disease (SVD) using multi-parametric MRI data, where the cognitive performance of 58 patients, measured by five neuropsychological tests, is predicted by a multi-input CNN model taking brain images and demographic data. Each of the cognitive test scores was predicted using different MRI-derived features. As MCI due to SVD has been hypothesized to be the effect of white matter damage, the DTI-derived features MD and FA produced the best prediction of the TMT-A score, which is consistent with the existing literature. In a second study, an interpretable deep learning system was developed, aimed at 1) classifying Alzheimer's disease patients and healthy subjects, 2) examining, with CNN visualization tools, the neural correlates of the disease that cause cognitive decline in AD patients, and 3) highlighting the potential of interpretability techniques to expose a biased deep learning model. Structural magnetic resonance imaging (MRI) data from 200 subjects were used by the proposed CNN model, which was trained using a transfer learning-based approach and produced a balanced accuracy of 71.6%. Brain regions in the frontal and parietal lobes showing cerebral cortex atrophy were highlighted by the visualization tools.
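The slice-level leakage described above can be illustrated with a small splitting example: when 2D slices are split at random, slices from the same subject land in both the training and validation sets, inflating validation scores, whereas grouping the split by subject ID prevents this. The sketch below uses scikit-learn's splitters and placeholder data, not the study's software.

```python
# Minimal sketch of slice-level vs. subject-level data splitting.
import numpy as np
from sklearn.model_selection import GroupShuffleSplit, train_test_split

n_subjects, slices_per_subject = 58, 30
subject_ids = np.repeat(np.arange(n_subjects), slices_per_subject)
slices = np.arange(len(subject_ids))  # stand-ins for 2D MRI slices

# Leaky: a random slice-level split scatters each subject across both sets.
leaky_train, leaky_val = train_test_split(slices, test_size=0.2, random_state=0)
assert set(subject_ids[leaky_train]) & set(subject_ids[leaky_val])  # overlap!

# Correct: a subject-level split keeps every subject's slices together.
gss = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
train_idx, val_idx = next(gss.split(slices, groups=subject_ids))
assert not set(subject_ids[train_idx]) & set(subject_ids[val_idx])
```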
Abstract:
The mechanical control of supragingival biofilm is accepted as one of the most important measures for treating and preventing dental caries and periodontal diseases. Nevertheless, keeping dental surfaces biofilm-free is not an easy task. In this regard, chemical agents, mainly in the form of mouthwashes, have been studied to help overcome the difficulties involved in the mechanical control of biofilm. The aim of this paper was to discuss proposals for the teaching of supragingival chemical control (SCC) in order to improve dentists' knowledge of this clinical issue. Firstly, the literature regarding the efficacy of antiseptics is presented, clearly showing that chemical agents are clinically effective in reducing biofilm and gingival inflammation when used as adjuncts to mechanical control. It is therefore suggested that content related to SCC be included in the curricula of dental schools. Secondly, some essential topics are recommended for inclusion in the teaching of SCC: the skills and competencies expected of a graduate dentist regarding SCC; how to include this content in the curriculum; the teaching-learning tools and techniques to be employed; and the program content.
Abstract:
Master's report (PES II), Pre-School Education and Teaching in the 1st Cycle of Basic Education, 26 June 2014, Universidade dos Açores.
Abstract:
Until recently, the great majority of courses in science and technology areas, where lab work is a fundamental part of the apprenticeship, could not be taught at a distance. This reality is changing with the dissemination of remote laboratories. Supported by resources based on new information and communication technologies, it is now possible to remotely control a wide variety of real laboratories. However, most of them are designed specifically for this purpose, are inflexible, and resemble the real ones only in their functionality. In this paper, an alternative remote lab infrastructure devoted to the study of electronics is presented. Its main characteristics are, from a teacher's perspective, reusability and simplicity of use and, from the students' point of view, an exact replication of the real lab, enabling them to complement or finish at home the work started in class. The remote laboratory is integrated into the Learning Management System in use at the school and may therefore be combined with other web experiments and e-learning strategies, while safeguarding secure access.
Abstract:
In the past few years we have witnessed the rapid development of distance learning tools such as Open Educational Resources (OER) and Massive Open Online Courses (MOOCs). This paper presents the "Mathematics without STRESS" MOOC project, a cooperation between four schools of the Polytechnic Institute of Oporto (IPP). The concept of the MOOC and its quickly growing popularity are presented, complemented by a discussion of some MOOC definitions. The project's development process is described, focusing on the MOOC structure used and on the several types of course materials produced. Finally, a short discussion of the problems and challenges met throughout the project is presented. It is also our goal to contribute to a change in the way the teaching and learning of Mathematics is seen and practiced today.