10 results for Basic ideas
at Instituto Politécnico do Porto, Portugal
Abstract:
This research work focused on the study of gallinaceous feathers, a waste that may be valorised as a sorbent to remove the dye Dark Blue Astrazon 2RN (DBA), supplied by Dystar. The study addressed the following aspects: optimization of the experimental conditions through factorial design methodology, kinetic studies in a continuous stirred tank adsorber (at pH 7 and 20 ºC), equilibrium isotherms (at pH 5, 7 and 9, at 20 and 45 ºC) and column studies (at 20 ºC, at pH 5, 7 and 9). To evaluate the influence of other components on the sorption of the dyestuff, all experiments were performed both for the dyestuff in aqueous solution and in real textile effluent. The pseudo-first and pseudo-second order kinetic models were fitted to the experimental data, the latter giving the best fit for the aqueous solution of the dyestuff. For the real effluent, both models fit the experimental results, with no statistically significant difference between them. A Central Composite Design (CCD) was used to evaluate the effects of temperature (15-45 ºC) and pH (5-9) on the sorption in aqueous solution. The influence of pH was more significant than that of temperature, and the optimal conditions selected were 45 ºC and pH 9. Both the Langmuir and Freundlich models could fit the equilibrium data. In the concentration range studied, the highest sorbent capacity was obtained under the optimal conditions in aqueous solution, corresponding to a maximum capacity of 47 ± 4 mg g-1. The Yoon-Nelson, Thomas and Yan models fitted the column experimental data well. The highest breakthrough time for 50% removal, 170 min, was obtained at pH 9 in aqueous solution. The presence of the dyeing agents in the real wastewater decreased the sorption of the dyestuff, mostly at pH 9, the optimal pH; the effect of pH is less pronounced in the real effluent than in aqueous solution. This work shows that feathers can be used as a sorbent in the treatment of textile wastewaters containing DBA.
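The pseudo-second order model mentioned above has a well-known linearized form, t/qt = 1/(k2·qe²) + t/qe, so the equilibrium capacity qe and rate constant k2 can be estimated by a simple linear regression of t/qt against t. A minimal sketch in Python, using synthetic data built around the reported 47 mg g-1 capacity (the rate constant k2 here is an assumed, purely illustrative value, not taken from the study):

```python
import numpy as np

def fit_pseudo_second_order(t, qt):
    """Fit the linearized pseudo-second-order model t/qt = 1/(k2*qe^2) + t/qe.

    A straight-line fit of t/qt versus t gives qe from the slope and
    k2 from the intercept.
    """
    slope, intercept = np.polyfit(t, t / qt, 1)
    qe = 1.0 / slope
    k2 = 1.0 / (intercept * qe**2)
    return qe, k2

# Synthetic uptake curve generated from qe = 47 mg/g (the reported maximum
# capacity) and an assumed k2 = 0.005 g mg^-1 min^-1 -- illustrative only.
qe_true, k2_true = 47.0, 0.005
t = np.array([5.0, 10, 20, 40, 60, 120, 180])   # time, min
qt = qe_true**2 * k2_true * t / (1 + qe_true * k2_true * t)  # uptake, mg/g

qe_fit, k2_fit = fit_pseudo_second_order(t, qt)
print(round(qe_fit, 1), round(k2_fit, 4))  # recovers ~47.0 and ~0.005
```

Because the synthetic data are generated from the model itself, the regression recovers the parameters exactly; on real kinetic data the quality of this linear fit is what distinguishes pseudo-second from pseudo-first order behaviour.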
Abstract:
Fractional order calculus (FOC) is as old as integer-order calculus, although until recently its application lay almost exclusively in mathematics. Many real systems are better described by FOC differential equations, as FOC is a well-suited tool for analyzing problems of fractal dimension, with long-term "memory" and chaotic behavior. These characteristics have attracted engineers' interest in recent years, and FOC is now a tool used in almost every area of science. This paper introduces the fundamentals of FOC and some applications in systems identification, control, mechatronics, and robotics, where it is a promising research field.
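One standard entry point to FOC is the Grünwald-Letnikov definition, which generalizes the finite-difference derivative to non-integer order and discretizes directly. A minimal numerical sketch (the step size and test function are illustrative choices, not from the paper): for f(t) = t, the half-order derivative is 2·sqrt(t/π), which the approximation should reproduce.

```python
import math

def gl_fractional_derivative(f, alpha, t, h=1e-3):
    """Grunwald-Letnikov approximation of the order-alpha derivative of f at t.

    Uses the truncated sum h^(-alpha) * sum_k w_k * f(t - k*h), where the
    weights w_k = (-1)^k * binom(alpha, k) follow the recurrence
    w_{k+1} = w_k * (k - alpha) / (k + 1).
    """
    n = int(t / h)
    total, w = 0.0, 1.0  # w_0 = 1
    for k in range(n + 1):
        total += w * f(t - k * h)
        w *= (k - alpha) / (k + 1)
    return total / h**alpha

# Half-derivative of f(t) = t at t = 1; the exact value is 2/sqrt(pi) ~ 1.1284.
approx = gl_fractional_derivative(lambda t: t, 0.5, 1.0)
print(round(approx, 3))
```

This first-order scheme is the basis of many practical FOC controller discretizations; higher accuracy is usually obtained with short-memory truncation or Oustaloup-style filter approximations.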
Abstract:
Genetic Algorithms (GAs) are adaptive heuristic search algorithms based on the evolutionary ideas of natural selection and genetics. The basic concept of GAs is to simulate the processes in natural systems necessary for evolution, specifically those that follow the principle of survival of the fittest first laid down by Charles Darwin. Particle Swarm Optimization (PSO), on the other hand, is a population-based stochastic optimization technique inspired by the social behavior of bird flocking or fish schooling. PSO shares many similarities with evolutionary computation techniques such as GAs: the system is initialized with a population of random solutions and searches for optima by updating generations. Unlike a GA, however, PSO has no evolution operators such as crossover and mutation. In PSO, the potential solutions, called particles, fly through the problem space by following the current optimum particles. PSO is attractive because there are few parameters to adjust. This paper presents a hybridization of a GA and a PSO algorithm (crossing the two algorithms). The resulting algorithm is applied to the synthesis of combinational logic circuits. With this combination it is possible to take advantage of the best features of each algorithm.
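The PSO component described above, in which particles update their velocities toward both their personal best and the global best, can be sketched in a few lines. This is a generic textbook PSO on a toy objective, not the paper's hybrid GA-PSO; the parameter values (inertia w, acceleration constants c1, c2) are common illustrative defaults:

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO: velocities are pulled toward personal and global bests."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(0)
best, best_val = pso_minimize(lambda x: sum(xi**2 for xi in x), dim=2)
print(best_val)  # close to 0 for this sphere function
```

A GA-PSO hybrid of the kind the paper proposes would interleave steps like this with GA-style selection and recombination over the same population.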
Abstract:
In order to cater for an extended readership, crime fiction, like most popular genres, is based on the repetition of a formula allowing for the reader's immediate identification. This first domestication is followed, at the time of its translation, by a second process, which wipes out those characteristics of the source text that may come into conflict with the dominant values of the target culture. An analysis of the textual and paratextual strategies used in the English translation of José Carlos Somoza's La caverna de las ideas (2000) shows the efforts to make the novel more easily marketable in the English-speaking world through the elimination of most of the obstacles to easy readability.
Abstract:
Speech interfaces for Assistive Technologies are not common and are usually replaced by other interface types: the market they target is not considered attractive, speech technologies are still not widespread, and industry still thinks they present performance risks, especially Speech Recognition systems. As speech is the most elemental and natural way of communicating, it has strong potential to enhance inclusion and quality of life for broader groups of users with special needs, such as people with cerebral palsy and elderly people staying in their homes. This work is a position paper in which the authors argue for the need to make speech the basic interface in assistive technologies. Among the main arguments: speech is the easiest way to interact with machines; there is a growing market for embedded speech in assistive technologies, since the number of disabled and elderly people is expanding; speech technology is already mature enough to be used but needs adaptation for people with special needs; and there is still a great deal of R&D to be done in this area, especially regarding the Portuguese market. The main challenges are presented and future directions are proposed.
Abstract:
Open innovation is a hot topic in innovation management. Its basic premise is to open up the innovation process. The innovation process, in a general sense, may be seen as the process of designing, developing and commercializing a novel product or service to improve a company's added value. The development of Web 2.0 tools facilitates such contributions, opening space for the emergence of crowdsourcing innovation initiatives. Crowdsourcing is a form of outsourcing directed not at other companies but at the crowd, by means of an open call, mostly through an Internet platform. Innovation intermediaries, in a general sense, are organizations that work to enable innovation, acting as brokers or agents between two or more parties; usually they are also engaged in other activities, such as inter-organizational networking and technology development. A crowdsourcing innovation intermediary is an organization that mediates the communication and relationship between the seekers (companies that aspire to solve some problem or to take advantage of a business opportunity) and a crowd that is prone to contribute ideas based on its knowledge, experience and wisdom. This paper identifies and analyses the functions to be performed by a crowdsourcing innovation intermediary through a grounded-theory analysis of the literature. The resulting model is presented and explained. It summarizes eight main functions that can be performed in a crowdsourcing process, namely diagnosis, mediation, linking knowledge, community, evaluation, project management, intellectual property governance, and marketing and support. These functions are associated with a learning cycle that covers all the crowdsourcing activities that can be carried out by the broker.
Abstract:
As technology advances, not only do new standards and programming styles appear, but some previously established ones also gain relevance. In a new Internet paradigm, where interconnection between small devices is key to the development of new businesses and to scientific advancement, there is a need for simple solutions that anyone can implement, so that ideas can become more than just ideas. Open-source software is alive and well, especially in the area of the Internet of Things. This opens windows for many low-capital entrepreneurs to experiment with their ideas and actually develop prototypes, which can help identify problems with a project or shed light on possible new features and interactions. As programming becomes more and more popular among people from fields unrelated to software, there is a need for guidance in developing something beyond basic algorithms, which is where this thesis comes in: a comprehensive document explaining the challenges and available choices in developing a sensor data and message delivery system that scales well and implements the delivery of critical messages. Modularity and extensibility were also given great importance, making this an affordable tool for anyone who wants to build a sensor network of this kind.
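As one way to illustrate "delivery of critical messages" in such a system, a dispatcher can drain a priority queue so that critical messages always go out before routine sensor readings. This is a hypothetical sketch under that assumption, not the thesis's actual design; the class and message names are invented for the example:

```python
import heapq
from itertools import count

class MessageQueue:
    """Dispatch queue that always delivers critical messages before normal ones."""
    CRITICAL, NORMAL = 0, 1  # lower value = higher delivery priority

    def __init__(self):
        self._heap = []
        self._seq = count()  # tie-breaker: FIFO order within the same priority

    def publish(self, payload, priority=NORMAL):
        heapq.heappush(self._heap, (priority, next(self._seq), payload))

    def deliver(self):
        """Pop the highest-priority message, or None if the queue is empty."""
        return heapq.heappop(self._heap)[2] if self._heap else None

q = MessageQueue()
q.publish("temp=21.5")
q.publish("smoke detected", priority=MessageQueue.CRITICAL)
q.publish("humidity=40")
order = [q.deliver(), q.deliver(), q.deliver()]
print(order)  # ['smoke detected', 'temp=21.5', 'humidity=40']
```

The sequence counter keeps delivery deterministic and fair within each priority level, which matters once many sensors publish concurrently.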
Abstract:
Manipulator systems are rather complex and highly nonlinear, which makes their analysis and control difficult. Classic system theory is well known; however, it is inadequate in the presence of strong nonlinear dynamics. Nonlinear controllers produce good results [1], and work has been done, e.g., relating the manipulator nonlinear dynamics with the frequency response [2-5]. Nevertheless, given the complexity of the problem, systematic methods that permit drawing conclusions about stability, imperfect modelling effects, compensation requirements, etc. are still lacking. In section 2 we start by analysing the variation of the poles and zeros of the descriptive transfer functions of a robot manipulator, in order to motivate the development of more robust (and computationally efficient) control algorithms. Based on this analysis, a new multirate controller, which is an improvement of the well-known "computed torque controller" [6], is presented in section 3. Some research in this area was done by Neuman [7,8], showing that better robustness is possible if the basic controller structure is modified. The present study stems from those ideas and attempts to give a systematic treatment, resulting in easy-to-use standard engineering tools. Finally, conclusions are presented in section 4.
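For reference, the "computed torque controller" [6] that section 3 builds on is usually stated in the following standard textbook form (the gain matrices are design choices, not values from this work):

```latex
% Rigid-body manipulator dynamics:
%   \tau = M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + g(q)
% Computed torque law, with tracking error e = q_d - q:
\tau = M(q)\left(\ddot{q}_d + K_v\,\dot{e} + K_p\,e\right)
       + C(q,\dot{q})\,\dot{q} + g(q)
% Under perfect modelling, the closed loop linearizes and decouples to
%   \ddot{e} + K_v\,\dot{e} + K_p\,e = 0
```

The sensitivity of this scheme to imperfect models of M, C and g is exactly what motivates the robustness analysis and the multirate modification discussed in the text.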
Abstract:
EPE Report - Internship report in Pre-School Education: This report was produced within the curricular unit Prática Pedagógica Supervisionada (Supervised Pedagogical Practice), part of the Master's degree in Pre-School Education and Teaching of the 1st Cycle of Basic Education at the Escola Superior de Educação do Porto, during the 2013/2014 academic year, conferring professional qualification for the pre-school context. This document aims to describe and analyse the formative path developed by the master's student throughout her supervised pedagogical practice, from a reflective perspective on the construction of professional knowledge for Education. This formative path entailed an investigative attitude, as well as the mobilization of scientific and legal knowledge in order to articulate theory and practice, aiming at an integrated construction of knowledge. The pedagogical practice was grounded in a constructivist perspective, assigning the child an active role in the construction of her own learning. It was based on cooperative teamwork, realized through debate and the sharing of ideas among the various participants in the action, as a means of transforming the educational reality. Throughout the practice, a fundamental role was also given to observation of the context, which is essential for a full understanding and knowledge of the child. In short, the whole formative path, in which the pedagogical practice is embedded, led to a problematization of the questions emerging from practice, developing an inquiring and reflective attitude. Despite its significant contribution to the acquisition of personal and professional competences, the formative path must be regarded as a continuous construction, grounded in a principle of lifelong learning.