12 results for Adaptive learning platform
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
The goal of the study was to evaluate an e-learning course entitled “Nursing interventions to manage distressed and disturbed patients”, intended for psychiatric nurses, using Kirkpatrick’s evaluation model. The aim was to describe nurses’ reactions, learning, behaviour change and the impacts resulting from this e-learning course. This dissertation comprises four papers, and the data were collected in 2008-2012 from three different sources: electronic databases, an e-learning platform and psychiatric hospitals. First, a systematic literature review was conducted to understand the effectiveness of e-learning. Second, an RCT study was implemented to investigate the impact of the e-learning course on nurses’ job satisfaction, knowledge and attitudes (N=158). Third, to complete the picture of nurses’ views of the e-learning course in relation to knowledge transfer, the nurses’ perspective was studied (N=33). Lastly, the effects of the e-learning course from the nursing managers’ perspective in psychiatric hospital organisations were studied (N=28). The systematic review showed that nurses were satisfied with e-learning, yet the RCT study found no effects on nurses’ job satisfaction. The RCT study also showed no effects on nurses’ learning in terms of knowledge increase, but there was a change in attitudes. The managers described changes in the nurses’ knowledge and attitudes. The nurses’ behaviour changed as knowledge was transferred from the e-learning course into practice, and they pointed out development issues related to their work. The final impacts of the e-learning course revealed its advantages and disadvantages and its implications for nurses’ work. This dissertation provides new insight into nurses’ reactions, learning, behaviour change and the impacts resulting from an e-learning course in their continuing education.
In order to improve nurses’ continuing education, systematic evaluation is needed, for which Kirkpatrick’s evaluation model is a useful tool.
Abstract:
Massive Open Online Courses (MOOCs) have been at the center of attention in recent years. However, the main problem of all online learning environments is their lack of personalization according to learners’ knowledge, learning styles and other learning preferences. This research explores the parameters and features used for personalization in the literature and, based on them, evaluates how well current MOOC platforms have been personalized. It then proposes a design framework for the personalization of MOOC platforms that fulfils most of the personalization parameters found in the literature, including learning style, as well as the personalization features. An assessment of the proposed design framework shows that it supports the personalization of MOOCs well.
Abstract:
This study concerns the automation of functional user interface testing for the web-based ViLLE learning environment developed at the University of Turku. For the study, I collected extensive material from the general literature and articles on the topic, as well as from Internet sources providing information specific to the implementation. I also conducted a small number of interviews with testing experts. In the empirical part of the study, I selected the testing tool to be used and implemented the test automation of the ViLLE learning environment with the selected tool, applying in practice the good practices for automating functional user interface testing presented in the theoretical part of the study. The study was carried out using a qualitative research method. Based on the observations collected in the empirical part of the study, I answered the following research questions:
• How was the testing tool selected, and what were the most important criteria affecting the selection?
• How well is the selected testing tool suited to automating the functional user interface testing of the ViLLE learning environment?
• In what ways does applying good test automation practices affect the test automation produced here?
• Did the implemented test automation exhibit the problems typical of functional user interface test automation described in the theoretical framework of the study, and how were these problems solved?
The results of the study show fairly clearly that the Vaadin TestBench testing tool, selected for the automation work on the basis of an evaluation carried out at the beginning of the study, is well suited to its purpose.
In addition, I was able to reliably observe that the need for test automation maintenance and the time spent writing tests decrease significantly when the tests are, from the very beginning, structured modularly and, through certain technical means, made as independent of the user interface structure as possible. Problems in building test automation can be caused by the tool itself, the implementation of the system under test, and the test execution environment. Despite preparing, on the basis of the literature review, for the typical problems encountered in building test automation, some problems also occurred that I had not been able to anticipate. Nevertheless, preparing for potential problems in advance clearly helped with most of the problems encountered.
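The modularity principle described above (keeping tests as independent of the user interface structure as possible) is commonly realised with a page-object style of test design. The thesis used Vaadin TestBench in Java; the following is only a minimal Python sketch of the idea with a stubbed driver, so all names (FakeDriver, LoginPage, the locators) are hypothetical.

```python
# Sketch: tests talk to page objects, and page objects hide UI-structure
# details behind a single locator table. If the UI changes, only the
# table needs maintenance, not every test.

class FakeDriver:
    """Stand-in for a real WebDriver; records interactions."""
    def __init__(self):
        self.state = {}

    def type(self, locator, text):
        self.state[locator] = text

    def click(self, locator):
        self.state["clicked"] = locator


class LoginPage:
    # All UI-structure knowledge lives here, in one place.
    LOCATORS = {
        "user": "id=username-field",
        "pass": "id=password-field",
        "submit": "id=login-button",
    }

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.type(self.LOCATORS["user"], user)
        self.driver.type(self.LOCATORS["pass"], password)
        self.driver.click(self.LOCATORS["submit"])


def test_login_flow():
    driver = FakeDriver()
    LoginPage(driver).log_in("student", "secret")
    assert driver.state["clicked"] == "id=login-button"

test_login_flow()
```

A test written this way never mentions element IDs directly, which is one of the technical means by which maintenance effort is reduced.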
Abstract:
This thesis is research into the recent complex spatial changes in Namibia and Tanzania and local communities’ capacity to cope with, adapt to and transform the unpredictability associated with these processes. I scrutinise the concept of resilience and its potential application to explaining the development of local communities in Southern Africa when facing various social, economic and environmental changes. My research is based on three distinct but overlapping research questions: What are the main spatial changes and their impact on the study areas in Namibia and Tanzania? What are the adaptation, transformation and resilience processes of the studied local communities in Namibia and Tanzania? How are innovation systems developed, and what is their impact on the resilience of the studied local communities in Namibia and Tanzania? I use four ethnographic case studies concerning environmental change, global tourism and innovation system development in Namibia and Tanzania, together with mixed-methodological approaches, to study these issues. The results of my empirical investigation demonstrate that the spatial changes in the localities within Namibia and Tanzania are unique, loose assemblages: the result of the complex, multisided, relational and evolutionary development of human and non-human elements that do not necessarily have linear causalities. Several changes co-exist and are interconnected, though uncertain and unstructured, and, together with the multiple stressors related to poverty, they have made communities more vulnerable to different changes. The communities’ adaptation and transformation measures have been mostly reactive, based on contingency and post hoc learning. Despite various anticipation techniques, coping measures, adaptive learning and self-organisation processes occurring in the localities, the local communities are constrained by their uneven power relationships within the larger assemblages.
Thus, communities’ own opportunities to increase their resilience are limited without changing the relations within these multiform entities. Therefore, larger cooperation models are needed, such as an innovation system based on the interactions of different actors; fostering such cooperation requires collaboration among, and input from, a diverse set of stakeholders to combine different sources of knowledge, innovation and learning. Accordingly, both Namibia and Tanzania are developing an innovation system as a key policy for fostering transformation towards knowledge-based societies. Finally, the development of an innovation system needs novel bottom-up approaches to increase the resilience of local communities and to embed the system in those communities. Therefore, innovation policies in Namibia have emphasised the role of indigenous knowledge, and Tanzania has established the Living Lab network.
Abstract:
This thesis examines the history and evolution of information system process innovation (ISPI) processes (adoption, adaptation, and unlearning) within information system development (ISD) work in an internal information system (IS) department and in two IS software house organisations in Finland over a 43-year period. The study offers insights into the influential actors and their dependencies in deciding over ISPIs. The research uses a qualitative research approach, and the research methodology involves the description of the ISPI processes, how the actors searched for ISPIs, and how the relationships between the actors changed over time. The existing theories were evaluated using conceptual models of the ISPI processes based on the innovation literature in the IS area. The main focus of the study was to observe changes in the main ISPI processes over time. The main contribution of the thesis is a new theory. Here the term theory should be understood as 1) a new conceptual framework of the ISPI processes and 2) new ISPI concepts and categories, together with the relationships between the ISPI concepts inside the ISPI processes. The thesis gives a comprehensive and systematic account of the history and evolution of the ISPI processes; reveals the factors that affected ISPI adoption; studies ISPI knowledge acquisition, information transfer, and adaptation mechanisms; and reveals the mechanisms affecting ISPI unlearning, the changes in the ISPI processes, and the diverse actors involved in the processes. The results show that both the internal IS department and the two IS software houses sought opportunities to improve their technical skills and career paths, and this created an innovative culture. When new technology generations come to the market, the platform systems need to be renewed, and therefore the organisations invest in ISPIs in cycles. The extent of internal learning and experimentation was higher than that of external knowledge acquisition.
Until the outsourcing event (1984), decision-making was centralised and the internal IS department was very influential over ISPIs. After outsourcing, decision-making became distributed between the two IS software houses, the IS client, and its internal IT department. The IS client wanted to ensure that information systems would serve the business of the company and thus wanted to co-operate closely with the software organisations.
Abstract:
This Master’s thesis describes the development of the core of a communication application for the Symbian platform. The requirement for the application as a whole was to respond to missed calls with predefined text messages according to user-defined rules. The non-functional requirements were reducing resource usage and enabling reuse. Thus, the goal of this work was to develop a core that encapsulates the application functionality that is independent of the user interface and reusable. The development was guided by the Unified Process, an iterative, use-case-driven and architecture-centric software process. It also encouraged the use of other established industry methods, such as design patterns and visual modelling with the Unified Modelling Language. Design patterns were used during development, and the software was modelled visually to advance and clarify the design. Platform services were exploited to minimise development time and resource usage. The main tasks of the core were defined as sending messages and storing and checking rules. The different parts of the application, namely the application server and the user interface, were able to use the core, and the core had no dependencies on the user interface level. Thus, resource usage decreased and reusability increased. Message sending was implemented using Symbian platform mechanisms. For storing the rules, a storage framework was created that isolates the internal representation of the rules from the external one. In this case, a relational database was chosen as the external storage format. Rule checking was implemented through conventional object collaboration. The main goal was achieved. This and the other outcomes judged to be good, such as reusability and reduced resource usage, were attributed to the use of design patterns and the Unified Process. These methods proved to adapt even to small projects.
The methods were also found to support and encourage learning during development, which was essential in this case.
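The core described above stores user-defined rules and checks them to pick a reply to a missed call. The thesis implemented this in C++ on Symbian; the following Python sketch only illustrates the rule-checking idea, and all names and rule fields (Rule, RuleEngine, caller_group) are hypothetical.

```python
# Sketch: rules are held in an internal form, independent of how they are
# persisted externally (the thesis used a relational database), and
# checking picks the first matching rule by priority.

from dataclasses import dataclass

@dataclass
class Rule:
    caller_group: str   # e.g. "family", "work", or "any" as a catch-all
    reply_text: str
    priority: int       # lower number = checked first

class RuleEngine:
    def __init__(self, rules):
        self.rules = sorted(rules, key=lambda r: r.priority)

    def reply_for(self, caller_group):
        for rule in self.rules:
            if rule.caller_group in (caller_group, "any"):
                return rule.reply_text
        return None  # no matching rule: send nothing

engine = RuleEngine([
    Rule("work", "In a meeting, will call back.", 1),
    Rule("any", "Cannot answer right now.", 9),
])
print(engine.reply_for("work"))    # matches the work rule first
print(engine.reply_for("family"))  # falls through to the catch-all
```

Keeping the engine’s interface free of any storage or UI concern is what allows both the application server and the user interface to reuse it.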
Abstract:
The skill of programming is a key asset for every computer science student. Many studies have shown that it is a hard skill to learn, and the outcomes of programming courses have often been substandard. Thus, a range of methods and tools have been developed to assist students’ learning processes. One of the biggest fields in computer science education is the use of visualizations as a learning aid, and many visualization-based tools have been developed to aid the learning process during the last few decades. The studies conducted in this thesis focus on two visualization-based tools, TRAKLA2 and ViLLE. This thesis includes results from multiple empirical studies on the effects that the introduction and usage of these tools have on students’ opinions and performance, and on the implications from a teacher’s point of view. The results show that students preferred to do web-based exercises and felt that those exercises contributed to their learning. The usage of the tools motivated students to work harder during their course, which was reflected in overall course performance and drop-out statistics. We have also shown that visualization-based tools can be used to enhance the learning process, and that one of the key factors is a higher and more active level of engagement (see the Engagement Taxonomy by Naps et al., 2002). Automatic grading accompanied by immediate feedback helps students to overcome obstacles during the learning process and to grasp the key element of the learning task. Such tools can help us cope with the fact that many programming courses are overcrowded while teaching resources are limited. They allow us to tackle this problem by utilizing automatic assessment in the exercises that are most suitable to be done on the web (such as tracing and simulation), since this supports students’ independent learning regardless of time and place.
In summary, we can use our courses’ resources more efficiently to increase the quality of the learning experience of the students and the teaching experience of the teacher, and even to increase the performance of the students. This thesis also offers methodological results that contribute to developing insight into the conduct of empirical evaluations of new tools or techniques. When we evaluate a new tool, especially one accompanied by visualization, we need to give a proper introduction to it and to the graphical notation used by the tool. The standard procedure should also include capturing the screen with audio to confirm that the participants of the experiment are doing what they are supposed to do. By taking such measures in studies of the learning impact of visualization support, we can avoid drawing false conclusions from our experiments. As computer science educators, we face two important challenges. Firstly, we need to start to deliver the message, in our own institutions and all over the world, about new, scientifically proven innovations in teaching such as TRAKLA2 and ViLLE. Secondly, we have relevant experience of conducting teaching-related experiments, and thus we can support our colleagues in learning the essential know-how of research-based improvement of their teaching. This change can transform academic teaching into publications, and by utilizing this approach we can significantly increase the adoption of new tools and techniques and overall increase the knowledge of best practices. In the future, we need to combine our forces and tackle these universal and common problems together by creating multi-national and multi-institutional research projects. We need to create a community and a platform in which we can share these best practices and at the same time easily conduct multi-national research projects.
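The automatic-grading-with-immediate-feedback idea above can be sketched for a tracing exercise. Neither TRAKLA2 nor ViLLE necessarily works exactly like this; the exercise format and feedback messages here are hypothetical. The student predicts the values a variable takes while tracing a loop, and the grader compares the prediction against the actual trace.

```python
# Sketch: auto-assessment of a tracing exercise with targeted feedback.

def trace_variable(n):
    """Actual trace of `total` in: total = 0; for i in 1..n: total += i"""
    total, states = 0, []
    for i in range(1, n + 1):
        total += i
        states.append(total)
    return states

def grade(student_answer, n):
    expected = trace_variable(n)
    if student_answer == expected:
        return 100, "Correct trace!"
    # Immediate, targeted feedback: point at the first wrong step.
    for step, (got, want) in enumerate(zip(student_answer, expected), start=1):
        if got != want:
            return 0, f"Check step {step}: expected {want}, got {got}."
    return 0, f"Expected {len(expected)} steps, got {len(student_answer)}."

print(grade([1, 3, 6, 10], 4))  # (100, 'Correct trace!')
print(grade([1, 3, 7, 10], 4))  # feedback points at step 3
```

Because grading is deterministic, the feedback is available the moment the student submits, which is the property the abstract credits with helping students over obstacles.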
Abstract:
The new product development process is a massive investment for a company that aims to reduce its products’ time-to-market. The capability for a shorter time-to-market allows a longer life-cycle for products introduced to the market earlier, but it also gives the advantage of being able to start a product launch later while simultaneously learning from customer behaviour and competitors. The product launch support operations are the last ramp-up activities before the product launch. This study defines what these operations mean in a product platform and how they can be streamlined to be more efficient. The methodology includes interviews, innovative group brainstorming and regular working group meetings. The challenges concerning the current state of product launch support operations are allocated to four categories: General, Process, Project Resources and Project Management, comprising altogether ten sub-challenges. The challenges include issues related to technology and marketing management, branding strategy, organizing the global platform structure, harmonizing processes and clarifying handovers between stakeholders in the process. The study proposes a new Product Launch Support organization and a clarification of its roles, responsibilities and tasks. In addition, a new project management tool and a Lessons Learned practice are suggested to improve project management. The study can be seen as a pre-study aimed at combining technological and marketing know-how in the product ramp-up process before actual production. The suggested future steps include more detailed specifications and implementation in order to reach the long-range target of reduced time-to-market.
Abstract:
Through advances in technology, System-on-Chip design is moving towards integrating tens to hundreds of intellectual property blocks into a single chip. In such a many-core system, on-chip communication becomes a performance bottleneck for high performance designs. Network-on-Chip (NoC) has emerged as a viable solution to the communication challenges in highly complex chips. The NoC architecture paradigm, based on a modular packet-switched mechanism, can address many of the on-chip communication challenges, such as wiring complexity, communication latency, and bandwidth. Furthermore, the combined benefits of 3D IC and NoC schemes provide the possibility of designing a high performance system in a limited chip area. The major advantages of 3D NoCs are considerable reductions in average latency and power consumption. Several factors degrade the performance of NoCs. In this thesis, we investigate three main performance-limiting factors: network congestion, faults, and the lack of efficient multicast support. We address these issues by means of routing algorithms. Congestion of data packets may lead to increased network latency and power consumption. Thus, we propose three different approaches for alleviating such congestion in the network. The first approach is based on measuring the congestion information in different regions of the network, distributing the information over the network, and utilizing this information when making a routing decision. The second approach employs a learning method to dynamically find the less congested routes according to the underlying traffic. The third approach is based on a fuzzy-logic technique for making better routing decisions when traffic information for different routes is available. Faults affect performance significantly, as packets then have to take longer paths in order to be routed around the faults, which in turn increases congestion around the faulty regions.
We propose four methods for tolerating faults at the link and switch level by using only the shortest paths as long as such a path exists. The unique characteristic of these methods is that they tolerate faults while also maintaining the performance of NoCs. To the best of our knowledge, these algorithms are the first approaches to bypass faults before reaching them while avoiding unnecessary misrouting of packets. Current implementations of multicast communication result in a significant performance loss for unicast traffic. This is due to the fact that the routing rules of multicast packets limit the adaptivity of unicast packets. We present an approach in which both unicast and multicast packets can be routed efficiently within the network. While providing more efficient multicast support, the proposed approach does not affect the performance of unicast routing at all. In addition, in order to reduce the overall path length of multicast packets, we present several partitioning methods along with analytical models for latency measurement. This approach is discussed in the context of 3D mesh networks.
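The congestion-aware routing idea in the first approach above can be sketched as follows: among the minimal (shortest-path) output directions, pick the neighbour reporting the lowest congestion. This is a generic 2D-mesh illustration, not the thesis’s actual algorithm; the congestion metric (per-direction buffer occupancy) is an assumption.

```python
# Sketch: adaptive minimal routing in a 2D mesh, biased by congestion.

def minimal_directions(cur, dst):
    """Directions that keep the packet on a shortest path in a 2D mesh."""
    (cx, cy), (dx, dy) = cur, dst
    dirs = []
    if dx > cx: dirs.append("E")
    if dx < cx: dirs.append("W")
    if dy > cy: dirs.append("N")
    if dy < cy: dirs.append("S")
    return dirs

def route_step(cur, dst, congestion):
    """Pick the least-congested minimal direction.
    `congestion` maps direction -> reported load (e.g. buffer occupancy)."""
    options = minimal_directions(cur, dst)
    if not options:
        return None  # already at the destination
    return min(options, key=lambda d: congestion.get(d, 0))

# East is on a shortest path but congested, so the router goes north first.
print(route_step((0, 0), (2, 2), {"E": 7, "N": 1}))  # "N"
```

Restricting the choice to minimal directions keeps the route short, while the congestion lookup provides the adaptivity; the trade-offs between local and region-level congestion information are what the thesis’s three approaches differ on.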
Abstract:
The portfolio as a means of demonstrating personal skills has lately been gaining prominence among technology students. This is partially due to the introduction of electronic portfolios, or e-portfolios. As platforms for e-portfolio management with different approaches have been introduced, the learning cycle, traditional portfolio pedagogy, and learner-centricity have sometimes been forgotten, and as a result the tools have for the most part been used as data depositories. The purpose of this thesis is to show how institutions can support the construction of e-portfolios by IT students through different tools related to study advising, teaching, and learning. The construction process is presented as a cycle based on learning theories. Actions related to the various phases of the e-portfolio construction process are supported by the implemented software applications. To maximize learner-centricity and minimize the intervention of the institution, the evaluated and controlled actions for these practices can be separated from the e-portfolios, leaving the construction of the e-portfolio to the students. The main contributions of this thesis are the implemented applications, which support e-portfolio construction by assisting in planning, organizing, and reflecting activities. Ultimately, this supports the students in constructing better and more extensive e-portfolios. The implemented tools include 1) JobSkillSearcher, to help students recognize the skill demands of the ICT industry, 2) WebTUTOR, to support students’ personal study planning, 3) Learning Styles, to determine students’ learning styles, and 4) MyPeerReview, to provide a platform for carrying out anonymous peer review processes in courses.
The most visible outcome of an e-portfolio is its representation, meaning that one can use it to demonstrate personal achievements when seeking a job and gaining employment. Testing the tools and the selected open-source e-portfolio application indicates that the richness of e-portfolio content can be increased by using the implemented applications.
Abstract:
This thesis is concerned with state and parameter estimation in state space models. The estimation of states and parameters is an important task when mathematical modelling is applied in many different application areas, such as global positioning systems, target tracking, navigation, brain imaging, the spread of infectious diseases, biological processes, telecommunications, audio signal processing, stochastic optimal control, machine learning, and physical systems. In Bayesian settings, the estimation of states or parameters amounts to computing the posterior probability density function. Except for a very restricted number of models, it is impossible to compute this density function in closed form; hence, we need approximation methods. A state estimation problem involves estimating the states (latent variables) that are not directly observed in the output of the system. In this thesis, we use the Kalman filter, the extended Kalman filter, Gauss–Hermite filters, and particle filters to estimate the states based on available measurements. Among these filters, particle filters are numerical methods that approximate the filtering distributions of non-linear, non-Gaussian state space models via Monte Carlo. The performance of a particle filter depends heavily on the chosen importance distribution; for instance, an inappropriate choice of importance distribution can cause the particle filter algorithm to fail to converge. In this thesis, we analyze the theoretical Lᵖ particle filter convergence with general importance distributions, where p ≥ 2 is an integer. A parameter estimation problem is concerned with inferring the model parameters from measurements. For high-dimensional complex models, parameter estimation can be carried out using Markov chain Monte Carlo (MCMC) methods. In its operation, an MCMC method requires the unnormalized posterior distribution of the parameters and a proposal distribution.
In this thesis, we show how the posterior density function of the parameters of a state space model can be computed by filtering-based methods, in which the states are integrated out. This type of computation is then applied to estimate the parameters of stochastic differential equations. Furthermore, we compute the partial derivatives of the log-posterior density function and use the hybrid Monte Carlo and scaled conjugate gradient methods to infer the parameters of stochastic differential equations. The computational efficiency of MCMC methods depends heavily on the chosen proposal distribution. A commonly used proposal distribution is the Gaussian, in which case the covariance matrix must be well tuned; adaptive MCMC methods can be used to tune it. In this thesis, we propose a new way of updating the covariance matrix using the variational Bayesian adaptive Kalman filter algorithm.
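The particle filtering described above can be illustrated with a minimal bootstrap filter for a one-dimensional linear-Gaussian model. The bootstrap filter uses the transition density as the importance distribution, whereas the thesis studies convergence for more general importance distributions; the model and noise levels here are arbitrary choices for illustration.

```python
# Sketch: bootstrap particle filter for x_k = 0.9 x_{k-1} + v_k,
# y_k = x_k + w_k, with Gaussian noises (stdlib only, no NumPy).

import math
import random

def particle_filter(measurements, n_particles=500, seed=0):
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in measurements:
        # 1) Propagate through the dynamics (the importance distribution).
        particles = [0.9 * x + rng.gauss(0.0, 0.5) for x in particles]
        # 2) Weight by the measurement likelihood N(y; x, 0.5^2).
        weights = [math.exp(-0.5 * ((y - x) / 0.5) ** 2) for x in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # 3) Posterior-mean estimate, then multinomial resampling.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

est = particle_filter([1.0, 0.8, 0.9, 1.1])
print(est)  # the estimates track the measurements closely
```

Step 1 is exactly where a general importance distribution would replace the transition density, and the weight in step 2 would then gain the corresponding importance-correction ratio; the choice of that distribution is what the convergence analysis in the thesis concerns.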