Abstract:
This paper presents a framework for a robotic production line simulation learning environment using Autonomous Ground Vehicles (AGVs). An eLearning platform is used as the interface with the simulator. The objective is to introduce students to production robotics using a familiar tool, an eLearning platform, together with a framework that simulates a production line using AGVs. This framework allows students to learn not only about robotics but also about several areas of industrial management engineering, without requiring extensive prior knowledge of robotics. The robotic production line simulation learning environment simulates a production environment in which AGVs transport materials to and from the production line. The simulator allows students to validate the AGV dynamics and provides information about the whole materials supplying system, including supply times, route optimization and inventory management. The students are required to address several topics, such as sensors, actuators, controllers and high-level management and optimization software. The simulator was developed with a well-known open-source tool from the robotics community, Player/Stage, which was extended with several add-ons so that students can interact with a complex simulation environment. These add-ons include an abstraction communication layer that executes events provided by the database server, which is programmed by the students. The students can visualize the effects of their instructions/programming in the simulator, which they access via the eLearning platform. The proposed framework aims to allow students from different backgrounds to fully experience robotics in practice by bridging the huge gap between theory and practice that exists in robotics.
Using an eLearning platform eliminates the installation problems that can arise from differences in the students' computer software configurations and makes the simulator accessible to all students, both at school and at home.
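The abstraction communication layer described above, which executes events written to the database by the students, can be pictured as an event dispatcher. The following is a minimal, hypothetical sketch of that idea; all names (`Event`, `Dispatcher`, the command strings) are illustrative and are not the actual Player/Stage add-on API.

```python
# Hypothetical sketch of an event-dispatch layer: a dispatcher consumes
# pending events (here an in-memory stand-in for the database event table
# programmed by the students) and translates each into a simulator command.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Event:
    kind: str      # e.g. "MOVE_AGV"
    payload: dict  # parameters written by the student's program

class Dispatcher:
    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[dict], str]] = {}

    def on(self, kind: str, handler: Callable[[dict], str]) -> None:
        self._handlers[kind] = handler

    def drain(self, queue: List[Event]) -> List[str]:
        """Consume pending events and return the simulator commands issued."""
        issued = []
        while queue:
            ev = queue.pop(0)
            handler = self._handlers.get(ev.kind)
            if handler:
                issued.append(handler(ev.payload))
        return issued

d = Dispatcher()
d.on("MOVE_AGV", lambda p: f"agv{p['id']} goto {p['x']},{p['y']}")
commands = d.drain([Event("MOVE_AGV", {"id": 1, "x": 4, "y": 2})])
print(commands)  # prints ['agv1 goto 4,2']
```

The real add-on would poll the database server and forward commands over the Player network protocol; the dispatch pattern itself is what decouples the students' code from the simulator.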
Abstract:
Currently, the world around us "reboots" every minute, and "staying at the forefront" seems an arduous task. The continuous and accelerated progress of society requires, from all actors, a dynamic and efficient attitude, both in monitoring that progress and in adapting to it. With regard to education, no matter how up to date we are with respect to contents, didactic strategies and technological resources, we are inevitably compelled to adapt to new paradigms and rethink traditional teaching methods. It is in this context that the contribution of e-learning platforms arises. Here, teachers and students have at their disposal new ways to enhance the teaching and learning process, and these platforms are seen, at present, as significant virtual environments supporting teaching and learning. This paper presents a project and attempts to illustrate the potential of new technologies as a supporting tool at different stages of teaching and learning, at different levels and in different areas of knowledge, particularly in Mathematics. We intend to promote a constructive discussion, presenting our current perception that the use of the Learning Management System Moodle by Higher Education teachers, as a supplementary teaching-learning environment complementing classroom sessions, can contribute to greater efficiency and effectiveness of teaching practice and improve student achievement. Regarding the learning analytics experience, we present some results obtained with assessment learning analytics tools, through which we found that assessing students' performance in online learning environments is a challenging and demanding task.
Abstract:
Teaching and learning computer programming is as challenging as it is difficult. Assessing students' work and providing individualised feedback to all of them is time-consuming and error-prone for teachers and frequently involves a time delay. Existing tools and specifications prove insufficient in complex evaluation domains where there is a greater need for practice. At the same time, Massive Open Online Courses (MOOCs) are emerging, revealing a new way of learning that is more dynamic and more accessible. However, this new paradigm raises serious questions regarding the monitoring of student progress and timely feedback. This paper provides a conceptual design model for a computer programming learning environment. This environment uses the portal interface design model, gathering information from a network of services such as repositories and program evaluators. The design model also includes integration with learning management systems, a central piece in the MOOC realm, endowing the model with characteristics such as scalability, collaboration and interoperability. The model is not limited to the domain of computer programming and can be adapted to any complex area that requires systematic evaluation with immediate feedback.
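The core of the program evaluators the model federates is running a submission against test cases and returning immediate, individualised feedback. The sketch below illustrates that idea under a simplifying assumption: the submission is modelled as a plain function, whereas a real evaluator would sandbox and execute source code.

```python
# Minimal sketch of an automatic program evaluator: run a student's
# submission against a set of (input, expected) cases and return a score
# plus per-case feedback. The function-based submission is an assumption
# made for brevity; real evaluators execute sandboxed source code.
from typing import Callable, List, Tuple

def evaluate(submission: Callable[[int], int],
             cases: List[Tuple[int, int]]) -> dict:
    feedback = []
    passed = 0
    for arg, expected in cases:
        got = submission(arg)
        if got == expected:
            passed += 1
        else:
            feedback.append(f"input {arg}: expected {expected}, got {got}")
    return {"score": passed / len(cases), "feedback": feedback}

# A (deliberately buggy) student submission for "double the input":
report = evaluate(lambda n: n + n if n >= 0 else n, [(2, 4), (-3, -6)])
print(report["score"])     # prints 0.5 -- one case failed
print(report["feedback"])  # prints ['input -3: expected -6, got -3']
```

The immediate, case-by-case feedback is exactly what removes the time delay the abstract identifies in manual assessment.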
Abstract:
Existing gamification services have features that preclude their use by e-learning tools. Odin is a gamification service that mimics the API of state-of-the-art services without these limitations. This paper describes Odin, its role in an e-learning system architecture requiring gamification, and details its implementation. The validation of Odin involved the creation of a small e-learning game, integrated in a Learning Management System (LMS) using the Learning Tools Interoperability (LTI) specification.
Abstract:
Currently, a learning management system (LMS) plays a central role in any e-learning environment. These environments include systems that handle the pedagogic aspects of the teaching-learning process (e.g. specialized tutors, simulation games) and the academic aspects (e.g. academic management systems). Thus, the potential for interoperability is an important, although overlooked, aspect of an LMS. In this paper, we make a comparative study of the interoperability level of the most relevant LMSs. We start by defining an application model and a specification model. For the application model, we create a basic application that acts as a tool provider for LMS integration. The specification model acts as the API that the LMS should implement to communicate with the tool provider; based on our research, we selected the Learning Tools Interoperability (LTI) specification from IMS. Finally, we compare the interoperability level of the LMSs, defined as the effort required to integrate the application into each LMS under study.
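To make the LTI integration concrete, the sketch below builds the minimal parameter set of an IMS LTI 1.x basic launch request. The three `lti_*` parameters are required by the LTI 1.x specification; the OAuth 1.0 signing of the POST body, which the consumer performs over the full parameter set, is omitted here.

```python
# Minimal parameter set of an IMS LTI 1.x basic launch. A tool provider
# first checks the required lti_* parameters before verifying the OAuth
# 1.0 signature (signature verification omitted in this sketch).
REQUIRED = {"lti_message_type", "lti_version", "resource_link_id"}

def build_launch(resource_link_id: str, user_id: str, roles: str) -> dict:
    return {
        "lti_message_type": "basic-lti-launch-request",
        "lti_version": "LTI-1p0",
        "resource_link_id": resource_link_id,
        "user_id": user_id,  # optional but commonly sent by the LMS
        "roles": roles,      # e.g. "Learner" or "Instructor"
    }

def validate_launch(params: dict) -> bool:
    """The tool provider's first sanity check on an incoming launch."""
    return REQUIRED <= params.keys() and \
        params["lti_message_type"] == "basic-lti-launch-request"

launch = build_launch("course42-res1", "student-7", "Learner")
print(validate_launch(launch))  # prints True
```

The "effort to integrate" compared in the paper largely reduces to how faithfully each LMS emits and signs this parameter set.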
Abstract:
This paper discusses the added value of a methodological design grounded in a conceptual approach to Terminology, applied to the task of harmonising the definition of the most promising educational scenario in today's Higher Education: blended learning. Since Terminology is a discipline concerned with the representation, description and definition of specialised knowledge through language, the essence of this field of knowledge responds to a fundamental need of today's society: putting order into our universe, in the words of Nuopponen (2011). In this context, concepts, as elements of the structure of knowledge (Sager, 1990), constitute a research object of considerable complexity: although language is postulated as a fundamental tool for describing and organising knowledge, the isomorphism principle cannot be taken for granted. The conceptual approach in Terminology proposes a precise view of the role of language in terminological work, its basic premise being that there is no one-to-one correspondence between the atomistic elements of knowledge and the elements of linguistic expression. For these reasons, methodological options confined to the analysis of specialised text are considered imprecise. This reflection argues that the key concept of a conceptual approach to terminological work implies combining a process of eliciting tacit knowledge, through concept-oriented discursive negotiation, with the analysis of textual corpora. Consequently, we argue that the interaction strategies between terminologist and domain expert deserve detailed attention, since they are markedly reflected in the quality of the results obtained.
Accordingly, the methodological model we propose rests on three stages that favour a refinement of this interaction, allowing the terminologist to act as a conceptualising, decision-making and intervening agent: (1) an exploratory stage of the domain under study; (2) a stage of onomasiological analysis of textual and discursive evidence; (3) a stage of modelling and validation of results. We will argue for the productivity of a cyclical sequence combining textual and discursive analysis for onomasiological purposes, collaborative interaction and introspection.
Abstract:
The Bologna Declaration (1999) brought about several changes, reconfiguring training models in the European Higher Education Area by 2010. From 2006 onwards, in Portugal, with the creation of new higher education programmes and the adaptation of existing ones to the Bologna model, there was a general reduction in the average duration of the various study cycles and the definition of general and specific competences for programmes and students. We reflect on the importance of information literacy, an evolving and comprehensive concept that can be summarised as knowing when and why one has an information need, where to find the information, and how to evaluate, use and communicate it ethically, including technological competences, a definition rooted in the interdisciplinary field of Information Science and in information behaviour. We highlight the advantage of information literacy training in higher education, which will certainly help to equip students with these competences and improve them. We argue for the need for interaction among multiple educational agents, notably the triad of students, librarians and teachers, with students regarded as active protagonists of their own learning who should be endowed with information literacy competences, a determining factor for their success. As for the librarian, endowed with new competences, including technological ones, he or she should act as a facilitator of the literacy training process, preferably integrated into a pedagogical project and the curriculum, articulating this educational action with students and teachers.
Corroborating the educational role of libraries and allying it to the inevitable use of the new information and communication technologies, we underline the role of digital libraries, which can efficiently meet users' expectations of access to quality information in a convenient, fast and low-cost way, with personalised online services, interaction and socialisation, through the collaborative editing tools typical of Web 2.0.
Abstract:
The development of nations depends on energy consumption, which is generally based on fossil fuels. This dependency produces irreversible and dramatic effects on the environment, e.g. large greenhouse gas emissions, which in turn cause global warming and climate changes, responsible for the rise of the sea level, floods, and other extreme weather events. Transportation is one of the main uses of energy, and its excessive fossil fuel dependency is driving the search for alternative and sustainable sources of energy such as microalgae, from which biodiesel, among other useful compounds, can be obtained. The process includes harvesting and drying, two energy-consuming steps, which are, therefore, expensive and unsustainable. The goal of this EPS@ISEP Spring 2013 project was to develop a solar microalgae dryer for the microalgae laboratory of ISEP. A multinational team of five students from distinct fields of study was responsible for designing and building the solar microalgae dryer prototype. The prototype includes a control system to ensure that the microalgae are not destroyed during the drying process. The solar microalgae dryer works as a distiller, extracting the excess water from the microalgae suspension. This paper details the design steps, the building technologies, the ethical and sustainability concerns, and compares the prototype with existing solutions. The proposed sustainable microalgae drying process is competitive as far as energy usage is concerned. Finally, the project contributed to increasing the students' awareness of deontological ethics, social commitment and sustainable development.
Abstract:
This paper presents a decision support methodology for electricity market players' bilateral contract negotiations. The proposed model is based on the application of game theory, using artificial intelligence to enhance the adaptive features of the decision support method. This model is integrated in AiD-EM (Adaptive Decision Support for Electricity Markets Negotiations), a multi-agent system that provides electricity market players with strategic behavior capabilities to improve their outcomes from energy contract negotiations. Although a diversity of tools that enable the study and simulation of electricity markets has emerged during the past few years, these are mostly directed to the analysis of market models and power systems' technical constraints, making them suitable tools to support the decisions of market operators and regulators. However, the equally important support of negotiating market players' decisions has been largely neglected. The proposed model contributes to overcoming the existing gap concerning effective and realistic decision support for electricity market negotiating entities. The proposed method is validated by realistic electricity market simulations using real data from the Iberian market operator, MIBEL. Results show that the proposed adaptive decision support features enable electricity market players to improve their outcomes from bilateral contract negotiations.
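The game-theoretic core of such decision support can be illustrated by a toy expected-payoff calculation. This is a sketch of the general idea only, not the AiD-EM implementation: the player scores each candidate contract price by its profit margin weighted by an estimated probability that the counterparty accepts, and both the price grid and the acceptance model below are invented for illustration.

```python
# Illustrative expected-payoff choice of a bilateral contract offer:
# pick the price maximising (price - cost) * P(counterparty accepts).
def best_offer(prices, cost, accept_prob):
    """Return the candidate price with the highest expected profit."""
    return max(prices, key=lambda p: (p - cost) * accept_prob(p))

# Hypothetical acceptance model: willingness to accept decays linearly
# for prices above 50 EUR/MWh and reaches zero at 80 EUR/MWh.
accept = lambda p: max(0.0, 1.0 - (p - 50.0) / 30.0)
offer = best_offer([55.0, 60.0, 65.0, 70.0], cost=40.0, accept_prob=accept)
print(offer)  # prints 60.0 -- the margin/acceptance trade-off optimum
```

In AiD-EM the acceptance model would itself be learned and adapted over repeated negotiations, which is where the artificial intelligence component enters.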
Abstract:
The vision of the Internet of Things (IoT) includes large and dense deployment of interconnected smart sensing and monitoring devices. This vast deployment necessitates collection and processing of large volume of measurement data. However, collecting all the measured data from individual devices on such a scale may be impractical and time consuming. Moreover, processing these measurements requires complex algorithms to extract useful information. Thus, it becomes imperative to devise distributed information processing mechanisms that identify application-specific features in a timely manner and with a low overhead. In this article, we present a feature extraction mechanism for dense networks that takes advantage of dominance-based medium access control (MAC) protocols to (i) efficiently obtain global extrema of the sensed quantities, (ii) extract local extrema, and (iii) detect the boundaries of events, by using simple transforms that nodes employ on their local data. We extend our results for a large dense network with multiple broadcast domains (MBD). We discuss and compare two approaches for addressing the challenges with MBD and we show through extensive evaluations that our proposed distributed MBD approach is fast and efficient at retrieving the most valuable measurements, independent of the number sensor nodes in the network.
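The reason dominance-based MAC protocols yield global extrema so cheaply can be sketched with CAN-style bitwise arbitration: each node transmits its reading as a priority, dominant bits win on the shared medium, and losing nodes back off, so one contention round leaves only the maximum. The simulation below is a simplified illustration, not the paper's protocol.

```python
# Simplified simulation of dominance-based arbitration: bit positions are
# contended from most to least significant; the wired medium carries the
# dominant bit (modelled as 1 here), and any node whose bit is recessive
# while the medium is dominant withdraws. The survivor holds the maximum.
def dominance_round(readings, bits=8):
    """Return the winning (maximum) reading after bitwise arbitration."""
    active = list(readings)
    for pos in reversed(range(bits)):                 # MSB first
        medium = max((r >> pos) & 1 for r in active)  # wired-OR of the bus
        active = [r for r in active if (r >> pos) & 1 == medium]
    return active[0]

print(dominance_round([17, 203, 64, 198]))  # prints 203 -- the global maximum
```

The key property is that the cost is one arbitration of `bits` slots regardless of how many nodes contend, which is what makes the mechanism attractive for dense deployments.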
Abstract:
This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the N-M algorithm context, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing different strategies in the N-M based repulsion algorithm.
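One plausible form of an erf-based repulsion merit function is sketched below. This is an assumption for illustration, not necessarily the authors' exact formula: the residual norm of the nonlinear system is augmented with a term that is large near already-found roots and vanishes away from them, steering a local search such as N-M towards new roots.

```python
# Sketch of an erf-based repulsion merit function for multi-root finding:
# merit(x) = ||F(x)||^2 + rho * sum_i (1 - erf(beta * ||x - r_i||)),
# where r_i are previously located roots. The repulsion term equals rho
# at a found root (erf(0) = 0) and decays to 0 with distance.
import math

def repulsion_merit(F, x, found_roots, rho=10.0, beta=5.0):
    residual = sum(fi * fi for fi in F(x))
    dist = lambda a, b: math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    repulsion = sum(rho * (1.0 - math.erf(beta * dist(x, r)))
                    for r in found_roots)
    return residual + repulsion

# One-equation system x^2 - 1 = 0, with roots at x = 1 and x = -1:
F = lambda x: [x[0] ** 2 - 1.0]
known = [[1.0]]  # the root already found
# The known root is now penalised; the other root remains attractive:
print(repulsion_merit(F, [1.0], known) > repulsion_merit(F, [-1.0], known))
```

Minimising this merit with N-M restarted from new initial simplices, and appending each converged minimiser to `found_roots`, is the repulsion loop the abstract describes.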
Abstract:
This article deals with a real-life waste collection routing problem. To efficiently plan waste collection, large municipalities may be partitioned into convenient sectors and only then can routing problems be solved in each sector. Three diverse situations are described, resulting in three different new models. In the first situation, there is a single point of waste disposal from where the vehicles depart and to where they return. The vehicle fleet comprises three types of collection vehicles. In the second, the garage does not match any of the points of disposal. The vehicle is unique and the points of disposal (landfills or transfer stations) may have limitations in terms of the number of visits per day. In the third situation, disposal points are multiple (they do not coincide with the garage), they are limited in the number of visits, and the fleet is composed of two types of vehicles. Computational results based not only on instances adapted from the literature but also on real cases are presented and analyzed. In particular, the results also show the effectiveness of combining sectorization and routing to solve waste collection problems.
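The "sectorize, then route" decomposition the abstract describes can be sketched in a few lines. This is a deliberately simplified illustration, not the paper's models: sectors are cut by a single coordinate boundary and each sector's route is built with a nearest-neighbour heuristic from the disposal point.

```python
# Illustrative "sectorize, then route" decomposition for waste collection:
# partition collection points into sectors, then solve each sector's route
# with a nearest-neighbour heuristic starting and ending at the depot
# (the single point of disposal of the first situation above).
import math

def nearest_neighbour_route(depot, points):
    route, remaining, current = [depot], list(points), depot
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route + [depot]  # vehicles return to the disposal point

def sectorize_and_route(depot, points, boundary_x):
    # Toy sectorization: split on the x-coordinate at boundary_x.
    sectors = ([p for p in points if p[0] < boundary_x],
               [p for p in points if p[0] >= boundary_x])
    return [nearest_neighbour_route(depot, s) for s in sectors if s]

routes = sectorize_and_route((0, 0), [(1, 1), (2, 0), (8, 1), (9, 3)], 5)
print(len(routes))  # prints 2 -- one route per sector
```

Real sectorization balances workload and vehicle capacities rather than cutting on one coordinate, but the structure, small routing problems solved per sector, is the same.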