960 results for Computer Engineering | Electrical engineering
Abstract:
Practical sessions are the backbone of qualification in engineering education. They lead to a better understanding of scientific concepts and theories and allow students to master them. The scarcity of practical sessions at many universities and institutions, owing to their cost and the frequent unavailability of instructors, has caused a significant decline in experimentation in engineering education over the last decades. Recently, with the progress of computer-based learning, remote laboratories have proven to be the best alternative to traditional ones, given their low cost and ubiquity. Some universities have already started to deploy remote labs in their practical sessions. This contribution compiles diverse experiences of deploying the remote laboratory Virtual Instrument Systems in Reality (VISIR) in the practical sessions of undergraduate engineering degrees at various universities within the VISIR community. It aims to show the impact of its usage on engineering education, as assessed by both students and teachers.
Abstract:
To meet the increasing demands of complex inter-organizational processes and the demand for continuous innovation and internationalization, it is evident that new forms of organisation are being adopted, fostering more intensive collaboration processes and sharing of resources, in what can be called collaborative networks (Camarinha-Matos, 2006:03). Information and knowledge are crucial resources in collaborative networks, and their management is a fundamental process to optimize. Knowledge organisation and collaboration systems are thus important instruments for the success of collaborative networks of organisations, and have been researched over the last decade in the areas of computer science, information science, management sciences, terminology and linguistics. Nevertheless, research in this area has not given much attention to multilingual contexts of collaboration, which pose specific and challenging problems. It is clear that access to and representation of knowledge will happen more and more in multilingual settings, which implies overcoming the difficulties inherent to the presence of multiple languages, through processes such as the localization of ontologies. Although localization, like other processes that involve multilingualism, is a rather well-developed practice, and its methodologies and tools are fruitfully employed by the language industry in the development and adaptation of multilingual content, it has not yet been sufficiently explored as an element of support to the development of knowledge representations, in particular ontologies, expressed in more than one language. Multilingual knowledge representation is thus an open research area calling for cross-contributions from knowledge engineering, terminology, ontology engineering, cognitive sciences, computational linguistics, natural language processing, and management sciences.

This workshop brought together researchers interested in multilingual knowledge representation, in a multidisciplinary environment, to debate the possibilities of cross-fertilization between these disciplines in contexts where multilingualism continuously creates new and demanding challenges for current knowledge representation methods and techniques. Six papers dealing with different approaches to multilingual knowledge representation are presented, most of them describing tools, approaches and results obtained in ongoing projects.

In the first paper, Andrés Domínguez Burgos, Koen Kerremans and Rita Temmerman present a software module that is part of a workbench for terminological and ontological mining: Termontospider, a wiki crawler that aims to traverse Wikipedia optimally in search of domain-specific texts from which to extract terminological and ontological information. The crawler is part of a tool suite for automatically developing multilingual termontological databases, i.e. ontologically underpinned multilingual terminological databases. The authors describe the basic principles behind the crawler and summarize the research setting in which the tool is currently being tested.

In the second paper, Fumiko Kano presents work comparing four feature-based similarity measures derived from the cognitive sciences.
The purpose of the comparative analysis is to identify the potentially most effective model for mapping independent ontologies in a culturally influenced domain. To that end, datasets based on standardized, pre-defined feature dimensions and values, obtainable from the UNESCO Institute for Statistics (UIS), were used to compare the similarity measures on objectively developed data. According to the author, the results demonstrate that the Bayesian Model of Generalization provides the most effective cognitive model for identifying the most similar corresponding concepts for a targeted socio-cultural community.

In the third paper, Thierry Declerck, Hans-Ulrich Krieger and Dagmar Gromann present ongoing work and propose an approach to the automatic extraction of information from multilingual financial Web resources, to provide candidate terms for building ontology elements or instances of ontology concepts. The authors present an approach complementary to the direct localization/translation of ontology labels: acquiring terminologies by accessing and harvesting the multilingual Web presences of structured-information providers in the field of finance. This leads to the detection of candidate terms in various multilingual sources in the financial domain that can be used not only as labels of ontology classes and properties but also for the possible generation of (multilingual) domain ontologies themselves.

In the fourth paper, Manuel Silva, António Lucas Soares and Rute Costa claim that, despite the availability of tools, resources and techniques aimed at the construction of ontological artifacts, developing a shared conceptualization of a given reality still raises questions about the principles and methods that support the initial phases of conceptualization. These questions become, according to the authors, more complex when the conceptualization occurs in a multilingual setting. To tackle these issues, the authors present a collaborative platform, conceptME, where terminological and knowledge representation processes support domain experts throughout a conceptualization framework, allowing the inclusion of multilingual data as a way to promote knowledge sharing, enhance conceptualization and support a multilingual ontology specification.

In the fifth paper, Frieda Steurs and Hendrik J. Kockaert present TermWise, a large project dealing with legal terminology and phraseology for the Belgian public services, i.e. the translation office of the Ministry of Justice. The project aims at developing an advanced tool that includes expert knowledge in the algorithms that extract specialized language from textual data (legal documents); its outcome is a knowledge database of Dutch/French equivalents for legal concepts, enriched with the phraseology related to the terms under discussion.

Finally, Deborah Grbac, Luca Losito, Andrea Sada and Paolo Sirito report on the preliminary results of a pilot project currently ongoing at the UCSC Central Library, where they propose to adapt, for subject librarians employed in large and multilingual academic institutions, the model used by translators working within European Union institutions.
The authors use User Experience (UX) analysis to provide subject librarians with visual support, by means of "ontology tables" depicting the conceptual linking and connections of words with concepts, presented according to their semantic and linguistic meaning. The organizers hope that the selection of papers presented here will be of interest to a broad audience and will be a starting point for further discussion and cooperation.
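As an illustration of the kind of feature-based similarity measure compared in Kano's paper, the following Python sketch implements Tversky's ratio model, a classic cognitive-science measure of this family. The feature sets and weights below are hypothetical assumptions, not taken from the paper or the UIS datasets.

```python
# A minimal sketch of a feature-based similarity measure (Tversky's ratio
# model). All feature names and weights are illustrative assumptions.

def tversky_similarity(a: set, b: set, alpha: float = 0.5, beta: float = 0.5) -> float:
    """Similarity grows with shared features, shrinks with distinctive ones."""
    common = len(a & b)   # features the two concepts share
    a_only = len(a - b)   # features distinctive to concept a
    b_only = len(b - a)   # features distinctive to concept b
    return common / (common + alpha * a_only + beta * b_only)

# Hypothetical feature dimensions for two education-domain concepts
concept_a = {"public", "compulsory", "primary_level"}
concept_b = {"public", "optional", "primary_level"}
print(tversky_similarity(concept_a, concept_b))  # 0.667: two of three features match
```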
Abstract:
Software tools in education have become popular since personal computers became widespread. Engineering courses led the way in this development, and such tools have become almost a standard. Engineering graduates are familiar with numerical analysis tools as well as with simulators (e.g. for electronic circuits), computer-aided design tools and others, depending on the degree. One of the main problems with these tools is deciding when and how students should start using them, so that they are beneficial rather than mere substitutes for potentially difficult calculations or design. In this paper a software tool to be used by first-year students in electronics/electricity courses is presented. The growing acknowledgement and acceptance of open-source software led to the choice of an open-source numerical analysis tool, Scilab, as the basis for a toolbox. The toolbox was developed to be used standalone or integrated into an e-learning platform; the platform used was Moodle. The first step was to assess the mathematical skills necessary to solve the problems covered in electronics and electricity courses. Analysing existing circuit-simulation tools, it is clear that, even though they are very helpful in showing the end result, they are less effective for studying and self-learning, since they show results but not the intermediate steps, which are crucial in problems that involve derivatives or integrals. They are also not very effective at producing graphical results that could be used in reports and for an overall better comprehension of the results. The developed toolbox, based on the numerical analysis software Scilab, gives its users not only the end results of a circuit analysis but also the expressions obtained in derivative and integral calculations, and lets them plot signals, obtain vector diagrams, etc. The toolbox runs entirely in the Moodle web platform and provides the same results as the standalone application. Students can use the toolbox through the web platform (on computers where they lack installation privileges) or on their personal computers by installing both Scilab and the toolbox. This approach was designed for first-year students of all engineering degrees that have electronics/electricity courses in their curricula.
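The intermediate-steps argument can be made concrete with a small symbolic computation. The sketch below uses Python with SymPy (not the Scilab toolbox described in the paper, whose internals are not given here) to solve a first-order RC charging circuit and print the resulting expression rather than only numeric end values.

```python
# A minimal sketch, in Python/SymPy, of exposing the intermediate symbolic
# expression for a series RC circuit driven by a DC source V:
#   R*C*vc'(t) + vc(t) = V,  with vc(0) = 0
import sympy as sp

t = sp.symbols("t", positive=True)
R, C, V = sp.symbols("R C V", positive=True)  # resistance, capacitance, source
vc = sp.Function("vc")                        # capacitor voltage

ode = sp.Eq(R * C * vc(t).diff(t) + vc(t), V)
sol = sp.dsolve(ode, vc(t), ics={vc(0): 0})
print(sol)                  # vc(t) = V*(1 - exp(-t/(R*C))), the expression itself
print(sp.diff(sol.rhs, t))  # the derivative step a plain simulator would hide
```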
Abstract:
Massive Open Online Courses (MOOCs) are gaining prominence in transversal teaching-learning strategies. However, many issues are still debated, namely assessment, which is largely recognized as a cornerstone in education. The large number of students involved requires a redefinition of strategies, which often rely on approaches based on tasks or challenging projects. Under these conditions, assessment is made through peer-reviewed assignments and online quizzes. Peer-reviewed assignments are often based upon sample answers or topics that guide students in the task of evaluating their peers. This chapter analyzes grading and evaluation in MOOCs, especially in science and engineering courses, within the context of education and grading methodologies, and discusses possible perspectives for pursuing grading quality in massive e-learning courses.
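As a concrete example of the peer-grading mechanics the chapter discusses, the sketch below shows one common aggregation strategy: collecting several peer scores per submission and taking a robust statistic such as the median. The policy, threshold and scores are illustrative assumptions, not taken from the chapter.

```python
# A minimal sketch of aggregating peer-review scores in a MOOC.
# Policy details (minimum review count, fallback) are assumptions.
from statistics import median

def aggregate_peer_grades(peer_scores, min_reviews=3):
    """Return the median peer grade, or None if too few reviews arrived."""
    if len(peer_scores) < min_reviews:
        return None  # route the submission to staff or extend the review window
    return median(peer_scores)  # the median damps outlier reviewers

print(aggregate_peer_grades([7.0, 8.0, 8.5, 3.0]))  # 7.5: the low outlier barely moves it
print(aggregate_peer_grades([9.0]))                 # None: not enough peer reviews
```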
Abstract:
Dissertation submitted for the degree of Doctor in Educational Sciences.
Abstract:
In recent years, new methods of clean and environmentally friendly energy production have been the focus of intense research efforts. Microbial fuel cells (MFCs) are devices that utilize naturally occurring microorganisms that feed on organic matter, such as wastewater, while producing electrical energy. The natural habitats of the bacteria thriving in microbial fuel cells are usually marine and freshwater sediments. These microorganisms are called dissimilatory metal-reducing bacteria (DMRB); in addition to metals like iron and manganese, they can use organic compounds like DMSO or TMAO, radionuclides and electrodes as terminal electron acceptors in their metabolic pathways. (...)
Piezoelectric poly(vinylidene fluoride) microstructure and poling state in active tissue engineering
Abstract:
Tissue engineering often relies on scaffolds to support cell differentiation and growth. Novel paradigms for tissue engineering include the need for active or smart scaffolds in order to properly regenerate specific tissues. In particular, as electrical and electromechanical cues are among the most relevant ones in determining tissue functionality in tissues such as muscle and bone, electroactive materials, and in particular piezoelectric ones, show strong potential for novel tissue engineering strategies, taking also into account that these phenomena exist within some specific tissues, indicating that they are also required during tissue regeneration. This review reports on piezoelectric materials used for tissue engineering applications. The materials most used in tissue engineering strategies are presented together with the main achievements, challenges and future needs for research and actual therapies. This review thus provides a compilation of the most relevant results and strategies and a starting point for novel research pathways addressing the most relevant and challenging open questions.
Abstract:
A novel approach for tissue engineering applications based on the use of magnetoelectric materials is presented. This work proves that magnetoelectric Terfenol-D/poly(vinylidene fluoride-co-trifluoroethylene) composites are able to provide mechanical and electrical stimuli to MC3T3-E1 pre-osteoblast cells, and that those stimuli can be remotely triggered by an applied magnetic field. Cell proliferation is enhanced by up to 25% when cells are cultured under mechanical (up to 110 ppm) and electrical (up to 0.115 mV) stimulation, showing that magnetoelectric cell stimulation is a novel and suitable approach for tissue engineering, allowing magnetic, mechanical and electrical stimuli.
Abstract:
Bone defects in revision knee arthroplasty are often located in load-bearing regions. The goal of this study was to determine whether a physiological load could be used as an in situ osteogenic signal to the scaffolds filling the bone defects. To answer this question, we proposed a novel translation procedure with four steps: (1) determining the mechanical stimulus using the finite element method; (2) designing an animal study to measure bone formation, spatially and temporally, using micro-CT imaging in the scaffold subjected to the estimated mechanical stimulus; (3) identifying bone formation parameters for the loaded and non-loaded cases in a recently developed mathematical model of bone formation in the scaffold; and (4) estimating the stiffness of, and bone formation in, the bone-scaffold construct. With this procedure, we estimated that, after 3 years, mechanical stimulation increases the bone volume fraction and the stiffness of the scaffold by 1.5- and 2.7-fold, respectively, compared to the non-loaded situation.
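To make step (4) concrete, the sketch below illustrates, with a deliberately simple saturating growth law and made-up rate constants, how loaded and non-loaded bone-formation estimates can be compared over time. Neither the model form nor the parameters are taken from the paper; this is only an illustration of the kind of comparison the procedure produces.

```python
# A purely illustrative bone-formation comparison. The exponential-saturation
# law and every parameter below are assumptions, not the paper's model.
import math

def bone_volume_fraction(t_years, k, bv_max=0.35):
    """BV/TV(t) = bv_max * (1 - exp(-k*t)): growth that saturates at bv_max."""
    return bv_max * (1.0 - math.exp(-k * t_years))

k_unloaded, k_loaded = 0.4, 0.9   # hypothetical formation rates, 1/year
t = 3.0
ratio = bone_volume_fraction(t, k_loaded) / bone_volume_fraction(t, k_unloaded)
print(f"loaded/unloaded bone volume fraction after {t:.0f} years: {ratio:.2f}x")
```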
Abstract:
Fault location has been studied in depth for transmission lines due to its importance in power systems. Nowadays the problem of fault location on distribution systems is receiving special attention, mainly because of power quality regulations. In this context, this paper presents application software, developed in Matlab™, that automatically calculates the location of a fault in a distribution power system, starting from the voltages and currents measured at the line terminal and a model of the distribution power system. The application is based on an N-ary tree structure, which suits the highly branched and non-homogeneous nature of distribution systems, and has been developed for single-phase, two-phase, two-phase-to-ground, and three-phase faults. The implemented application is tested using fault data from a real electrical distribution power system.
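The role of the N-ary tree can be illustrated with a small sketch: each bus of the feeder is a node, each downstream branch a child, and a depth-first traversal enumerates the candidate paths along which a fault at a given electrical distance might lie. The topology and distance logic below are hypothetical; the paper's impedance-based location calculation is not reproduced.

```python
# A minimal sketch of an N-ary tree over a branched distribution feeder.
# Bus names and section lengths are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Bus:
    name: str
    section_km: float                 # line length from the parent bus
    children: list = field(default_factory=list)

def candidate_paths(bus, prefix_km=0.0, path=()):
    """Yield (path, cumulative distance) for every bus reachable from here."""
    here = prefix_km + bus.section_km
    yield path + (bus.name,), here
    for child in bus.children:
        yield from candidate_paths(child, here, path + (bus.name,))

feeder = Bus("substation", 0.0,
             [Bus("b1", 1.2, [Bus("b2", 0.8), Bus("b3", 2.0)])])
for path, km in candidate_paths(feeder):
    print(" -> ".join(path), f"{km:.1f} km")
# A fault estimated at 2.0 km matches the path substation -> b1 -> b2.
```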
Abstract:
BACKGROUND: Living in a multisensory world entails the continuous sensory processing of environmental information in order to enact appropriate motor routines. The interaction between our body and our brain is the crucial factor in achieving such sensorimotor integration. Several clinical conditions dramatically affect the constant body-brain exchange, but the latest developments in biomedical engineering provide promising solutions for overcoming this communication breakdown. NEW METHOD: The latest technological developments have succeeded in transforming neuronal electrical activity into computational input for robotic devices, giving birth to the era of so-called brain-machine interfaces. By combining rehabilitation robotics and experimental neuroscience, the introduction of brain-machine interfaces into clinical protocols has provided a technological solution for bypassing neural disconnection and restoring sensorimotor function. RESULTS: Based on these advances, the recovery of sensorimotor functionality is progressively becoming a concrete reality. However, despite the success of several recent techniques, some open issues still need to be addressed. COMPARISON WITH EXISTING METHOD(S): Typical interventions for sensorimotor deficits include pharmaceutical treatments and manual/robotic assistance in passive movements. These procedures achieve symptom relief, but their applicability to more severe disconnection pathologies (e.g. spinal cord injury or amputation) is limited. CONCLUSIONS: Here we review how state-of-the-art solutions in biomedical engineering are continuously raising expectations in sensorimotor rehabilitation, as well as the current challenges, especially with regard to the translation of signals from brain-machine interfaces into sensory feedback and the incorporation of brain-machine interfaces into daily activities.
Abstract:
This master's thesis studies and presents, based on the literature, how evolutionary algorithms are used to solve different search and optimisation problems in the area of software engineering. Evolutionary algorithms are methods that imitate the natural evolution process: an artificial evolution process evaluates the fitness of each individual, where individuals are candidate solutions, and the next population of candidates is formed by exploiting the good properties of the current population through mutation and crossover operations. Different kinds of evolutionary-algorithm applications related to software engineering were identified in the literature, classified and presented, together with the necessary basics of evolutionary algorithms. It was concluded that the majority of such applications concern software design or testing; examples include classifying software production data, project scheduling, static task scheduling for parallel computing, allocating modules to subsystems, N-version programming, test data generation and generating an integration test order. Many applications were experimental rather than ready for real production use. There were also some Computer-Aided Software Engineering tools based on evolutionary algorithms.
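For readers unfamiliar with the mechanics summarized above, the sketch below is a minimal generational evolutionary algorithm applied to a toy problem (maximizing the number of ones in a bit string). The problem, operators and parameters are illustrative only and do not come from the thesis.

```python
# A minimal generational evolutionary algorithm: tournament selection,
# one-point crossover, bit-flip mutation. Toy fitness: count of ones.
import random

def evolve(fitness, length=20, pop_size=30, generations=50, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # tournament selection: keep the fitter of two random individuals
        parents = [max(random.sample(pop, 2), key=fitness) for _ in range(pop_size)]
        children = []
        for a, b in zip(parents[::2], parents[1::2]):
            cut = random.randrange(1, length)          # one-point crossover
            children += [a[:cut] + b[cut:], b[:cut] + a[cut:]]
        # bit-flip mutation on every gene with probability p_mut
        pop = [[g ^ 1 if random.random() < p_mut else g for g in ind]
               for ind in children]
    return max(pop, key=fitness)

best = evolve(fitness=sum)   # fitness = number of ones in the bit string
print(sum(best), best)
```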
Abstract:
Global energy consumption has been increasing yearly, and a large portion of it is used in rotating electrical machinery. It is clear that these machines should use energy efficiently. The aim of this dissertation is to improve the design process of high-speed electrical machines, especially from the mechanical engineering perspective, in order to achieve more reliable and efficient machines. The design process of high-speed machines is challenging due to high demands and several interactions between engineering disciplines such as mechanical, electrical and energy engineering. A multidisciplinary design flow chart, in which computer simulation is utilized, is proposed for a specific type of high-speed machine. In addition to utilizing simulation in parallel with the design process, two simulation studies are presented: the first finds the limits of two ball-bearing models; the second studies the improvement of machine load capacity in a compressor application, to exceed the limits of current machinery. The proposed flow chart and simulation studies show clearly that improvements in the high-speed machinery design process can be achieved. Engineers designing high-speed machines can use the flow chart and simulation results as a guideline during the design phase to achieve more reliable and efficient machines that use energy efficiently in the required operating conditions.
Abstract:
Atrial fibrillation, the most frequent arrhythmia in clinical practice, affects 2.3 million patients in North America. To study its mechanisms and potential therapies, animal models of atrial fibrillation have been developed. High-density epicardial electrical mapping is a well-established experimental technique for following atrial activity in vivo in response to electrical stimulation, remodelling, arrhythmias or modulation of the autonomic nervous system. In regions that are not accessible by epicardial mapping, non-contact endocardial mapping performed with a balloon-shaped catheter could provide a more complete description of atrial activity. In this study, a canine experiment was designed and analysed. Electro-anatomical reconstruction, epicardial mapping (103 electrodes), non-contact endocardial mapping (2048 virtual electrodes computed from a 64-channel balloon catheter) and direct-contact endocardial recordings were performed simultaneously. The recording systems were also simulated in a mathematical model of a canine right atrium. In the simulations and the experiments (after suppression of the atrio-ventricular node), activation maps were computed during sinus rhythm. Repolarization was assessed by measuring the area under the atrial T wave (ATa), a marker of the repolarization gradient. The results show an epicardial-endocardial correlation coefficient of 0.8 (experiment) and 0.96 (simulation) between activation maps, and a correlation coefficient of 0.57 (experiment) and 0.92 (simulation) between ATa values. Non-contact endocardial mapping thus appears to be a useful experimental tool for extracting information outside the regions covered by epicardial recording plaques.
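Numerically, the map-comparison step of the study reduces to correlating activation times (or ATa values) at matched epicardial and endocardial sites. The sketch below shows that computation on made-up numbers; the study's data are not reproduced.

```python
# A minimal sketch of the epicardial-endocardial map comparison:
# Pearson correlation between activation times at matched sites.
# All values below are hypothetical.
import numpy as np

epi_activation_ms = np.array([12.0, 18.5, 25.0, 31.0, 40.5])
endo_activation_ms = np.array([13.5, 17.0, 26.5, 30.0, 43.0])

r = np.corrcoef(epi_activation_ms, endo_activation_ms)[0, 1]
print(f"epicardial-endocardial correlation: r = {r:.2f}")
```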