76 results for Logic design.
Abstract:
An adaptive control scheme that damps the forced vibration of a car travelling along a bumpy road is investigated. It is based on a simple kinematic description of the desired behavior of the damped system. A modified PID controller containing an approximation of Caputo's fractional derivative suppresses the high-frequency components caused by bumps and dips, while the low-frequency variation due to hills and valleys is strictly tracked. Neither a complete dynamic model of the car nor 'a priori' information on the road surface is needed. The adaptive control realizes this kinematic design despite the existence of dynamically coupled, excitable internal degrees of freedom. The method is investigated via Scicos-based simulation of a paradigmatic example. It was found that both adaptivity and fractional-order derivatives are essential parts of a control that can keep the vibration of the load at bay without directly controlling its motion.
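The fractional derivative term of such a controller can be sketched with the Grünwald-Letnikov approximation, which for zero initial conditions coincides with Caputo's derivative. The gains, sampling period and truncated memory length below are illustrative assumptions, not the paper's controller:

```python
def gl_weights(alpha, n):
    """Grunwald-Letnikov weights w_k = (-1)^k * C(alpha, k), computed via
    the recursion w_k = w_{k-1} * (1 - (alpha + 1) / k)."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
    return w

def fractional_derivative(history, alpha, h):
    """Approximate D^alpha of a sampled signal (history given newest-first,
    sample period h). For alpha = 1 this reduces to a backward difference."""
    w = gl_weights(alpha, len(history))
    return sum(wk * fk for wk, fk in zip(w, history)) / h ** alpha

class FractionalPID:
    """PI-D^alpha controller: the derivative term has fractional order
    0 < alpha <= 1, which attenuates high-frequency error components more
    gently than a classical first-order derivative."""
    def __init__(self, kp, ki, kd, alpha, h, memory=200):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.alpha, self.h, self.memory = alpha, h, memory
        self.err_hist = []      # newest error first
        self.integral = 0.0

    def update(self, error):
        self.err_hist.insert(0, error)
        del self.err_hist[self.memory:]     # truncate the GL memory
        self.integral += error * self.h
        d = fractional_derivative(self.err_hist, self.alpha, self.h)
        return self.kp * error + self.ki * self.integral + self.kd * d
```

Choosing 0 < alpha < 1 is what blends damping of the bump-induced high frequencies with faithful tracking of the slow hill/valley profile.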
Abstract:
Teaching and learning computer programming is as challenging as it is difficult. Assessing students' work and providing individualised feedback to all of them is time-consuming and error-prone for teachers, and frequently involves a delay. Existing tools and specifications prove insufficient in complex evaluation domains where there is a greater need for practice. At the same time, Massive Open Online Courses (MOOCs) are emerging, revealing a new way of learning that is more dynamic and more accessible. However, this new paradigm raises serious questions regarding the monitoring of student progress and timely feedback. This paper provides a conceptual design model for a computer programming learning environment. The environment uses the portal interface design model, gathering information from a network of services such as repositories and program evaluators. The design model also includes integration with learning management systems, a central piece in the MOOC realm, endowing the model with characteristics such as scalability, collaboration and interoperability. The model is not limited to the domain of computer programming and can be adapted to any complex area that requires systematic evaluation with immediate feedback.
Abstract:
Further improvements in the implementation of demand response programs are needed in order to take full advantage of this resource, namely for participation in energy and reserve market products, which requires adequate aggregation and remuneration of small-sized resources. The present paper focuses on SPIDER, a demand response simulator that has been improved to simulate demand response together with realistic power system simulation. To illustrate the simulator's capabilities, the paper proposes a methodology focused on the aggregation of consumers and generators, providing adequate tools for the adoption of demand response programs by the involved players. The proposed methodology centres on a Virtual Power Player (VPP) that manages and aggregates the available demand response and distributed generation resources in order to satisfy the required electrical energy demand and reserve. The aggregation of resources is addressed by the use of clustering algorithms, and the operation costs for the VPP are minimized. The presented case study is based on a set of 32 consumers and 66 distributed generation units, running on 180 distinct operation scenarios.
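The clustering step can be illustrated with a minimal 1-D k-means over a single resource feature, for example available demand response capacity in kW. The feature choice and the deterministic initialisation are assumptions for the sketch, not the paper's algorithm:

```python
def kmeans_1d(points, k, iters=50):
    """Plain 1-D k-means grouping resources by one feature (e.g. available
    demand response capacity in kW). Initialising centroids from the first
    k points keeps the sketch deterministic."""
    centroids = list(points[:k])
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each resource to its nearest centroid
            i = min(range(k), key=lambda j: abs(p - centroids[j]))
            clusters[i].append(p)
        # recompute centroids; keep the old one if a cluster empties
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters
```

Each resulting cluster would then be aggregated and remunerated as a single group by the VPP.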
Abstract:
This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the context of the N-M algorithm, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing the different strategies in the N-M based repulsion algorithm.
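A minimal sketch of an erf-based repulsion merit function is shown below; the exact form used in the paper may differ, and the repulsion parameter delta is an assumption:

```python
import math

def residual_norm(F, x):
    """Euclidean norm of the residual of the nonlinear system F(x) = 0."""
    return math.sqrt(sum(v * v for v in F(x)))

def repulsion_merit(F, x, found_roots, delta=1.0):
    """Penalty-type merit with erf-based repulsion: near an already-located
    root xi, erf(delta * ||x - xi||) tends to 0, so the merit grows sharply
    and the local search is pushed away, towards new roots."""
    m = residual_norm(F, x)
    for xi in found_roots:
        d = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, xi)))
        m /= max(math.erf(delta * d), 1e-12)  # guard against division by zero
    return m
```

Repeatedly minimizing `repulsion_merit` with a Nelder-Mead solver, appending each located root to `found_roots`, yields the repulsion scheme.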
Abstract:
The Internet of Things (IoT) has emerged as a paradigm over the last few years as a result of the tight integration of the computing and the physical world. The requirement of remote sensing makes low-power wireless sensor networks one of the key enabling technologies of IoT. These networks face several challenges, especially in communication and networking, due to their inherent constraints: low-power operation, deployment in harsh and lossy environments, and limited computing and storage resources. The IPv6 Routing Protocol for Low-Power and Lossy Networks (RPL) [1] was proposed by the IETF ROLL (Routing Over Low-power and Lossy networks) working group and has been adopted as an IETF standard in RFC 6550 since March 2012. Although RPL largely satisfies the requirements of low-power and lossy sensor networks, several issues remain open for improvement and specification, in particular with respect to Quality of Service (QoS) guarantees and support for mobility. In this paper, we focus mainly on the RPL routing protocol. We propose some enhancements to the standard specification in order to provide QoS guarantees for static as well as mobile LLNs. For this purpose, we propose OF-FL (Objective Function based on Fuzzy Logic), a new objective function that overcomes the limitations of the standardized objective functions designed for RPL by considering important link and node metrics, namely end-to-end delay, number of hops, ETX (Expected Transmission Count) and LQL (Link Quality Level). In addition, we present the design of Co-RPL, an extension to RPL based on the corona mechanism that supports mobility, overcoming the problem of slow reaction to frequent topology changes and thus providing better quality of service, mainly in dynamic network applications.
Performance evaluation results show that both OF-FL and Co-RPL achieve a great improvement over the standard specification, mainly in terms of packet loss ratio and average network latency.
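A fuzzy combination of the four metrics named above can be sketched as follows; the membership shapes, thresholds and min-aggregation are illustrative assumptions, not the OF-FL rule base:

```python
def ramp_down(x, lo, hi):
    """Membership of 'metric is good': 1 below lo, 0 above hi, linear between."""
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)

def of_fl_quality(delay_ms, hops, etx, lql):
    """Fuzzy neighbour quality in [0, 1] built from the four metrics the
    paper names. Thresholds and the conservative min (fuzzy AND) are
    assumptions for the sketch."""
    mu_delay = ramp_down(delay_ms, 10.0, 200.0)
    mu_hops = ramp_down(hops, 1.0, 10.0)
    mu_etx = ramp_down(etx, 1.0, 5.0)
    mu_lql = ramp_down(lql, 1.0, 7.0)  # assumed scale: 1 best, 7 worst
    return min(mu_delay, mu_hops, mu_etx, mu_lql)

def best_parent(candidates):
    """Pick the candidate parent (delay, hops, etx, lql) with best quality."""
    return max(candidates, key=lambda c: of_fl_quality(*c))
```

In an RPL node, a score of this kind would rank candidate parents during DODAG construction instead of a single-metric objective function such as MRHOF.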
Abstract:
For efficient planning of waste collection routing, large municipalities may be partitioned into convenient sectors. The real case under consideration is the municipality of Monção, in Portugal. Waste collection involves more than 1600 containers over an area of 220 km² and a population of around 20,000 inhabitants. This is mostly a rural area where the population is distributed in small villages around the 33 borough centres (freguesias) that constitute the municipality. In most freguesias, waste collection is usually conducted 3 times a week; however, there are situations in which collection is done every day. The case reveals some general and specific characteristics which are not rare, yet not widely addressed in the literature. Furthermore, new methods and models to deal with sectorization and routing are introduced, which can be extended to other applications. Sectorization and routing are tackled following a three-phase approach. The first phase, which is the main concern of the presentation, introduces a new method for sectorization inspired by Electromagnetism and Coulomb's Law. The matter is not only territorial division, but also the frequency of waste collection, a critical issue in these types of applications. Special characteristics related to the number and type of deposition points were also a motivation for this work. The second phase addresses the routing problems in each sector: new Mixed Capacitated Arc Routing with Limited Multi-Landfills models are presented. The last phase integrates sectorization and routing. Computational results confirm the effectiveness of the entire novel approach.
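The Coulomb-inspired sectorization idea can be sketched as an attraction between containers and sector seeds; treating weekly collection demand as the 'charge' is an assumption made for illustration, not the paper's exact formulation:

```python
def attraction(charge_a, charge_b, pos_a, pos_b):
    """Coulomb-like attraction: product of 'charges' over squared distance."""
    dx, dy = pos_a[0] - pos_b[0], pos_a[1] - pos_b[1]
    d2 = dx * dx + dy * dy
    return charge_a * charge_b / d2 if d2 > 0 else float("inf")

def sectorize(containers, seeds):
    """Assign each container (demand, (x, y)) to the sector seed that pulls
    hardest, where weekly collection demand plays the role of charge, so
    high-frequency containers gravitate towards nearby, high-demand seeds."""
    assignment = []
    for demand, pos in containers:
        best = max(range(len(seeds)),
                   key=lambda s: attraction(demand, seeds[s][0], pos, seeds[s][1]))
        assignment.append(best)
    return assignment
```

The inverse-square law naturally balances geographic compactness against the weight given to collection frequency.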
Abstract:
In our hyper-modern context, there are factors that influence our perception of entities and our consequent degree of affection for or rejection of them. Although the graphic mark (detached from the typographic sign) is only one part of the larger whole of identity design, it deserves to be studied in the current conjuncture: one marked by a financial and social crisis, with high unemployment and situations that, while not harmful in themselves, should not be ignored, such as the growing creation of identities through outsourcing and the greater accessibility of computers and creative processes to designers and non-designers alike. Our main objective is to create a manual of practices compiling a grammar of the graphic mark that makes it explicit to young designers. We have already verified that this tool can be transferred to the conception and analysis of other graphic artefacts. We intend to examine whether the axioms of Gestalt, and the precepts formulated in the 1950s and 1960s such as those of Jacques Bertin, remain valid today, continuing to be a good basis for the creation and perception of graphic marks. Given the paradigm shift that new technologies have forged in the zeitgeist of design, and in the current conjuncture, we seek to confirm whether these precepts hold both for timeless marks and for contemporary, fluid marks, and to create a pedagogical tool that contributes to a visual literacy capable of decoding this iconic sign within this new reality. In this reality, with access to new digital media of interaction and collaboration, emotive expressions about graphic marks arise from designers, students and users on specialist websites, proving the interest in the objective of this ongoing project.
Abstract:
Optimization methods have been used to solve optimization problems in many areas of knowledge, such as Engineering, Statistics and Chemistry, among others. In many cases it is not possible to use derivative-based methods, due to the characteristics of the problem and/or its constraints, for example when the involved functions are non-smooth and/or their derivatives are not known. To solve this type of problem, a Java-based API was implemented, which includes only derivative-free optimization methods and can be used to solve both constrained and unconstrained problems. For solving constrained problems, the classic Penalty and Barrier functions were included in the API. In this paper a new approach to Penalty and Barrier functions, based on Fuzzy Logic, is proposed. Two penalty functions that impose a progressive penalization on solutions violating the constraints are discussed: the implemented functions impose a low penalization when the violation of the constraints is low and a heavy penalty when the violation is high. Numerical results obtained with twenty-eight test problems, comparing the proposed Fuzzy Logic based functions to six of the classic Penalty and Barrier functions, are presented. Considering the achieved results, it can be concluded that the proposed penalty functions, besides being very robust, also perform very well.
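One way such a progressive, fuzzy-inspired penalty can look is sketched below; the violation thresholds and weights are illustrative assumptions, not the paper's functions:

```python
def violation(x, constraints):
    """Total violation of inequality constraints g(x) <= 0."""
    return sum(max(g(x), 0.0) for g in constraints)

def fuzzy_weight(v, v_low=0.1, v_high=1.0, w_low=1.0, w_high=1000.0):
    """Progressive penalty weight: mild for near-feasible points, harsh for
    large violations, with a linear fuzzy membership in between."""
    if v <= v_low:
        return w_low
    if v >= v_high:
        return w_high
    mu = (v - v_low) / (v_high - v_low)  # membership of 'violation is high'
    return w_low + mu * (w_high - w_low)

def penalized(f, x, constraints):
    """Merit function a derivative-free solver would minimize."""
    v = violation(x, constraints)
    return f(x) + fuzzy_weight(v) * v
```

Because the weight grows with the violation instead of being a fixed constant, mildly infeasible iterates are not discarded prematurely, which suits derivative-free searches.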
Abstract:
In this paper, a module for homograph disambiguation in Portuguese Text-to-Speech (TTS) is proposed. The module works with a part-of-speech (POS) parser, used to disambiguate homographs that belong to different parts of speech, and a semantic analyzer, used to disambiguate homographs that belong to the same part of speech. The proposed algorithms are meant to solve a significant part of homograph ambiguity in European Portuguese (EP) (106 homograph pairs so far). The system is ready to be integrated in a Letter-to-Sound (LTS) converter. The algorithms were trained and tested with different corpora, and the experimental results yielded an accuracy rate of 97.8%. The methodology is also valid for Brazilian Portuguese (BP), since 95 homograph pairs are exactly the same as in EP. A comparison with a probabilistic approach was also carried out and the results are discussed.
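The two-stage pipeline (POS filter first, semantic fallback for same-POS pairs) can be sketched as follows; the lexicon entry and pronunciation labels are hypothetical placeholders, not the paper's data:

```python
# Hypothetical lexicon: each homograph maps to candidate readings tagged
# with a POS and a placeholder pronunciation label.
LEXICON = {
    "jogo": [
        {"pos": "NOUN", "pron": "closed_o"},  # 'game'
        {"pos": "VERB", "pron": "open_o"},    # 'I play'
    ],
}

def disambiguate(word, pos_tag, context=(), semantic_rules=None):
    """Stage 1: keep readings matching the POS tag from the parser.
    Stage 2: when readings share the same POS, fall back to a semantic
    rule applied to the surrounding context words."""
    readings = [r for r in LEXICON.get(word, []) if r["pos"] == pos_tag]
    if len(readings) == 1:
        return readings[0]["pron"]
    if semantic_rules and word in semantic_rules:
        return semantic_rules[word](context)
    return readings[0]["pron"] if readings else None
```

The returned pronunciation label would then be handed to the LTS converter to select the correct phonetic transcription.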
Abstract:
The interdisciplinarity of information systems, an applied discipline and design activity, and its study from different paradigmatic perspectives, explain the diversity of problems addressed. The context is broad and includes important issues beyond technology, such as application, use, effectiveness, efficiency and their organizational and social impacts. In design science, the research interest lies in contributing to the improvement of the processes of the design activity itself. The relevance of research in design science is associated with the results obtained for the improvement of living conditions in organizational, inter-organizational and societal contexts. In research whose results are artifacts, the adoption of design research as a research process is crucial to ensure discipline, rigor and transparency. Based on a literature review, this paper clarifies the terms design science and design research. This clarification is the main motivation for the paper and is determinant for the current phase of the three research projects in technologies and information systems presented here. As a result, the three projects are discussed in relation to the concepts of design science and design research.
Abstract:
In a scientific research project it is important to define the underlying philosophical orientation, because it influences the choices made regarding the scientific methods used, as well as the way they are applied. It is crucial, therefore, that the philosophy and the research design strategy are consistent with each other. These questions become even more relevant in qualitative research. Historically, the interpretive research philosophy has been associated with the social sciences and humanities, where the subjectivity inherent to human intervention is more explicitly acknowledged. The information systems field is primarily rooted in computer science, though it also integrates issues related to management and organizations. This shift from a purely technological orientation towards consideration of the problems of management and organizations has fostered the rise of research projects following the interpretive philosophy and using qualitative methods. This paper explores the importance of alignment between epistemological orientation and research design strategy in qualitative research projects. As a result, two PhD projects with different research design strategies, being developed in the technology and information systems field in the light of the interpretive paradigm, are presented.
Abstract:
An Information System is a system capable of storing, organizing and structuring data to help answer the needs of companies, including the capacity to respond to their day-to-day questions. An Information System can thus be defined as software that helps organize and analyse data, with the objective of providing useful information at the right time so that it can be used for decision-making or for more efficient management of the various flows a company may contain. In this sense, the project presented here centres on the design and construction of an Information System capable of managing the business of a company in the food sector, more specifically meat processing. It was developed in Oracle ADF, in order to take advantage of the benefits inherent to the technology and to web development. Being a relatively new technology on the market, mastered by few, its use at this moment can become a great advantage. For the development of the application, requirements were gathered and analysed, a database capable of supporting the software was created, and a login system capable of managing each user's sessions was developed. A process for entering and editing information was implemented, namely the recording of inputs, transformations and outputs. A section with the company's master data was also included, with the possibility of insertion, update and/or removal. In addition, validations were incorporated into all user-facing processes, in order to avoid inconsistent or duplicated data. The business logic was embedded in the application so that information can be consulted clearly, quickly and from several places, reducing the employee's/user's time and tasks, since the processes were automated.
With the implementation of this Information System, the company can benefit from an integrated system capable of managing and controlling its entire production process, reducing costs and waste while increasing productivity and efficiency.
Abstract:
The rapid evolution of devices containing integrated circuits, especially FPGAs (Field Programmable Gate Arrays) and, more recently, FPGA-based Systems on a Chip (SoCs), together with the evolution of the tools, has left a gap between the release of these technologies and the production of didactic materials that help engineers in hardware/software co-design with them. With the aim of helping to reduce this time gap, the present work describes the development of documents (tutorials) targeting two recent technologies: the VIVADO hardware/software development tool and the Zynq-7000 Z-7010 SoC, both developed by Xilinx. The documents produced are based on a basic project implemented entirely in programmable logic, and on the same project implemented through the embedded programmable processor, so that it is possible to evaluate the tool's design flow for a project implemented entirely in hardware and the design flow for the same project implemented on a hardware/software structure.
Abstract:
Adventure! The Paladin Order was an ambitious project that started out being developed as a complete video game. Its objective was to implement a different kind of tool that would make the game fully adaptive to the player's decisions, both in the interactions and dialogue with other characters and in the combat against the game's various enemies. Due to the author's inexperience, a large part of the time was spent studying and researching possible solutions that would allow the creation of an adaptive environment in a simple and interesting way, not only for programmers but also for anyone responsible for editing the game's dialogue and story. The results were quite interesting, revealing a system that depends simultaneously on the files from which the dialogue is retrieved and on a personality system that defines the behaviour of any game object or, at least, how other characters will react to it. The final product is a tool with solid foundations that allows a relatively simple implementation of a comprehensive, adaptive system, with few flaws and only some issues of code simplicity.