829 results for computer-aided engineering tool
Abstract:
Strokes affect thousands of people worldwide, leaving sufferers with severe disabilities that affect their daily activities. In recent years, new rehabilitation techniques have emerged, such as constraint-induced therapy, biofeedback therapy and robot-aided therapy. In particular, robotic techniques allow precise recording of movements and application of forces to the affected limb, making them a valuable tool for motor rehabilitation. In addition, robot-aided therapy can use visual cues conveyed on a computer screen to turn repetitive movement practice into an engaging task such as a game. Visual cues can also be used to control the information sent to the patient about exercise performance and to potentially address psychosomatic variables influencing therapy. This paper overviews the current state of the art in upper limb robot-mediated therapy, with a focus on the technical requirements of robotic therapy devices that lead to upper limb rehabilitation techniques facilitating reach-to-touch, fine motor control and whole-arm movements, and promoting rehabilitation beyond the hospital stay. The reviewed literature suggests that, while there is evidence supporting the use of this technology to reduce functional impairment, beyond the technological push the challenge ahead lies in providing effective outcome assessment and modalities with a stronger impact on transferring functional gains into functional independence.
Abstract:
Background: Constructive alignment (CA) is a pedagogical approach that emphasizes the alignment between the intended learning outcomes (ILOs), teaching and learning activities (TLAs) and assessment tasks (ATs), as well as the creation of a teaching/learning environment in which students can actively create their knowledge. Objectives: This paper investigates the extent to which courses in the Computer Engineering and Informatics departments at Dalarna University, Sweden, are constructively aligned. The study is based on empirical observations of teachers' perceptions of the implementation of CA in their courses. Methods: Ten teachers (five from each department) were asked to fill in a paper-based questionnaire containing questions on issues of implementing CA in courses. Results: Responses to the questionnaire items were mixed. Teachers clearly state the ILOs in their courses and try to align the TLAs and ATs to them. Compared to Informatics teachers, Computer Engineering teachers do not explicitly communicate the ILOs to their students, and they stated that their students are less active in learning activities. When asked for subjective ratings of their teaching methods, all teachers stated that their current teaching is teacher-centered but that they try to shift the focus of activity from themselves to the students. Conclusions: From the teachers' perspective, the courses are partially constructively aligned. The courses are "aligned", i.e. the ILOs, TLAs and ATs are aligned with each other, but they are not "constructive" since, according to the teachers, student engagement in learning activities was low, especially in the Computer Engineering department.
Abstract:
All over the world, organizations are becoming more and more complex, and there is a need to capture this complexity. This is where the DEMO methodology (Design and Engineering Methodology for Organizations), created and developed by Jan L. G. Dietz, reaches its potential: capturing the structure of business processes in a coherent and consistent set of diagrams with their respective grammatical rules. The creation of the WAMM (Wiki Aided Meta Modeling) platform was the main focus of this thesis, and its principal precursor was the idea of creating a Meta-Editor that supports semantic data and uses MediaWiki. This prototype Meta-Editor uses MediaWiki as a data store and builds on the ideas of the Universal Enterprise Adaptive Object Model and the concept of the Semantic Web to create a platform that suits our needs, through Semantic MediaWiki, which helps the computer interconnect information and people in a more comprehensive way, giving meaning to the content of the pages. The proposed Meta-Modeling platform allows the specification of the abstract syntax (i.e., the grammar) and the concrete syntax (e.g., symbols and connectors) of any language, as well as its model types and diagram types. We use the DEMO language as a proof-of-concept and example. All such specifications are made in a coherent and formal way through the creation of semantic wiki pages and the semantic properties connecting them.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
New information and communication technologies may be useful for providing more in-depth knowledge to students in many ways, whether through online multimedia educational material or through online debates with colleagues, teachers and other professionals in the area, in a synchronous or asynchronous manner. This paper focuses on participation in online discussion in e-learning courses for promoting learning. Although this is an important theoretical aspect, an analysis of the literature reveals few studies that evaluate the personal and social aspects of online course users in a quantitative manner. This paper introduces a method for diagnosing inclusion, digital proficiency and other personal aspects of students through a case study comparing Information Systems, Public Relations and Engineering students at a public university in Brazil. Statistical analysis and analysis of variance (ANOVA) were used as the methodology for data analysis in order to understand the relations between the components of the proposed method. The survey methodology, in its online format, was also used as a research instrument. The method is based on online questionnaires that diagnose the digital proficiency, time management, level of extroversion and social skills of the students. According to the sample studied, there is no strong correlation between digital proficiency and the individual characteristics tied to use of time, level of extroversion and social skills. The differences in course grades for some components are partly due to the subject 'Introduction to Economics' being offered to freshmen in Public Relations, whereas the subject 'Economics in Engineering' is offered in the final semesters of the Engineering and Information Systems courses. The difference could therefore be tied more to the respondents' age than to the course.
Information Systems students were observed to be older, with access to computers and the Internet at the workplace, compared to the other students, who access the Internet more often from home. This paper presents a pilot study aimed at conducting a diagnosis that permits proposing actions through which information and communication technology can contribute to student education. Three levels of digital inclusion are described as a scale to measure whether information technology increases personal performance and professional knowledge and skills. This study may be useful for other readers interested in themes related to engineering education. © 2013 IEEE.
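The group comparison this abstract mentions can be sketched from first principles. The code below computes a one-way ANOVA F statistic for three hypothetical groups of course grades; the group sizes and values are invented for illustration and are not data from the study.

```python
# One-way ANOVA F statistic computed from scratch (no external libraries).

def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA over a list of samples."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand_mean = sum(sum(g) for g in groups) / n

    # Between-group sum of squares: variation of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: variation of observations around their group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)

    ms_between = ss_between / (k - 1)    # between-group mean square
    ms_within = ss_within / (n - k)      # within-group mean square
    return ms_between / ms_within

# Hypothetical grades for three student groups (illustrative numbers only)
grades = [
    [7.0, 8.0, 6.5, 7.5],
    [5.0, 6.0, 5.5, 6.5],
    [8.0, 8.5, 7.5, 9.0],
]
print(round(one_way_anova_f(grades), 2))  # → 15.2
```

A large F relative to the F distribution with (k-1, n-k) degrees of freedom indicates that at least one group mean differs, which is the kind of between-course comparison the study reports.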
Abstract:
This paper deals with transient stability analysis based on time-domain simulation on vector processors. This approach requires the solution of a set of differential equations in conjunction with a set of algebraic equations. The solution of the algebraic equations has traditionally been treated as a scalar, sequential set of tasks, and solving these equations on vector computers requires further investigation to speed up the simulations. The main objective of this paper is therefore to present methods for solving the algebraic equations using vector processing. The results, obtained on a CRAY computer, show that on-line transient stability assessment is feasible.
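The abstract does not detail the paper's solution methods, but the core idea of replacing a scalar, sequential solve with whole-vector operations can be illustrated with a Jacobi iteration in NumPy, where each sweep updates all unknowns at once. The system below is an assumed toy example, not one from the paper.

```python
import numpy as np

def jacobi(A, b, iters=100):
    """Solve A x = b by Jacobi iteration; each sweep updates every unknown
    simultaneously as a vector operation instead of a sequential loop."""
    D = np.diag(A)                       # diagonal entries of A
    R = A - np.diagflat(D)               # off-diagonal remainder
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        x = (b - R @ x) / D              # whole-vector update per sweep
    return x

# Small diagonally dominant system (assumed example) so Jacobi converges
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([6.0, 10.0, 9.0])
x = jacobi(A, b)
print(np.allclose(A @ x, b, atol=1e-8))  # → True
```

On a vector machine each `(b - R @ x) / D` sweep maps onto hardware vector instructions, which is the kind of restructuring the paper's methods aim at.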
Abstract:
According to recent research carried out in the foundry sector, one of the most important concerns of these industries is to improve their production planning. A foundry production plan involves two dependent stages: (1) determining the alloys to be melted and (2) determining the lots that will be produced. The purpose of this study is to draw up minimum-cost production plans for the lot-sizing problem of small foundries. As suggested in the literature, the proposed heuristic addresses the problem stages hierarchically: first the alloys are determined and, subsequently, the items to be produced from them. A knapsack problem is proposed as a tool to determine the items to be produced from each furnace load. Moreover, we propose a genetic algorithm to explore possible sets of alloys and to determine the production plan for a small foundry. Our method attempts to overcome the difficulty of finding good production plans exhibited by the method proposed in the literature. The computational experiments show that the proposed methods give better results than those in the literature. Furthermore, the proposed methods do not require commercial software, which is favorable for small foundries. (C) 2010 Elsevier Ltd. All rights reserved.
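The item-selection stage described above is a classic 0/1 knapsack: a furnace load has fixed capacity, and each candidate lot consumes some of it while contributing some value to the plan. A minimal dynamic-programming sketch follows; the capacities, weights and values are hypothetical, and the paper's actual model (costs, due dates, alloy compatibility) is richer than this.

```python
def knapsack(capacity, weights, values):
    """0/1 knapsack by dynamic programming: best total value achievable
    without exceeding the furnace-load capacity."""
    best = [0] * (capacity + 1)
    for w, v in zip(weights, values):
        # Traverse capacities downward so each item is selected at most once
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Hypothetical furnace load of 10 units of molten alloy
weights = [5, 4, 6, 3]      # alloy consumed by each item lot
values = [10, 40, 30, 50]   # contribution of each lot to the plan
print(knapsack(10, weights, values))  # → 90
```

In the heuristic's hierarchy, a solve like this would run once per furnace load chosen by the outer genetic algorithm.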
Abstract:
Fraud is a global problem that has demanded increasing attention due to the rapid expansion of modern technology and communication. When statistical techniques are used to detect fraud, a critical factor is whether the fraud detection model is accurate enough to classify a case correctly as fraudulent or legitimate. In this context, the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by obtaining predicted values from models fitted to several replicated datasets, and then to combine them into a single predictive classification in order to improve classification accuracy. In this paper we present a pioneering study of the performance of discrete and continuous k-dependence probabilistic networks within the context of bagging predictors for classification. Through a large simulation study and various real datasets, we found that probabilistic networks are a strong modeling option with high predictive capacity, and that the bagging procedure yields a substantial further gain compared to traditional techniques. (C) 2012 Elsevier Ltd. All rights reserved.
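The bagging procedure described above can be sketched end to end. For brevity the sketch uses one-dimensional decision stumps as the base learner in place of the paper's k-dependence probabilistic networks, and the transaction data are invented; only the bootstrap-and-vote mechanism is the point here.

```python
import random

def fit_stump(data):
    """Fit a decision stump: flag as fraud (1) when the amount exceeds a
    threshold, chosen among the observed amounts to minimise training error."""
    best_t, best_err = None, None
    for t, _ in data:
        err = sum((x > t) != y for x, y in data)
        if best_err is None or err < best_err:
            best_t, best_err = t, err
    return best_t

def bagged_stumps(data, n_models=25, seed=0):
    """Bootstrap aggregating: fit one stump per bootstrap replicate."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_models):
        sample = [rng.choice(data) for _ in data]  # resample with replacement
        stumps.append(fit_stump(sample))
    return stumps

def predict(stumps, amount):
    """Combine the stumps into a single majority-vote classification."""
    votes = sum(amount > t for t in stumps)
    return int(2 * votes >= len(stumps))

# Hypothetical transaction amounts: label 0 = legitimate, 1 = fraudulent
data = [(10, 0), (15, 0), (20, 0), (22, 0), (80, 1), (95, 1), (120, 1), (150, 1)]
model = bagged_stumps(data)
print(predict(model, 12), predict(model, 100))  # → 0 1
```

Swapping `fit_stump` for any other learner (such as a k-dependence network) leaves the bagging scaffold unchanged, which is what makes it a generic accuracy-improvement wrapper.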
Abstract:
The main problem of cone beam computed tomography (CT) systems for industrial applications employing 450 kV X-ray tubes is the large amount of scattered radiation that is added to the primary radiation (signal). This stray radiation leads to a significant degradation of image quality. A better understanding of the scattering, and methods to reduce its effects, are therefore necessary to improve image quality. Several studies have been carried out in the medical field at lower energies, whereas studies in industrial CT, especially for energies up to 450 kV, are lacking. Moreover, the studies reported in the literature do not consider the scattered radiation generated by the CT system structure and the walls of the X-ray room (environmental scatter). In order to investigate the scattering in CT projections, a GEANT4-based Monte Carlo (MC) model was developed. The model, which has been validated against experimental data, has enabled the calculation of the scattering including the environmental scatter, the optimization of an anti-scatter grid suitable for the CT system, and the optimization of the hardware components of the CT system. The investigation of multiple scattering in the CT projections showed that its contribution is 2.3 times that of the primary radiation for certain objects. The results on environmental scatter showed that it is the major component of the scattering for aluminum box objects with a front size of 70 x 70 mm2, and that it strongly depends on the thickness of the object and therefore on the projection. For that reason, its correction is one of the key factors for achieving high-quality images. The anti-scatter grid optimized by means of the developed MC model was found to reduce the scatter-to-primary ratio in the reconstructed images by 20%. The object and environmental scatter calculated by means of the simulation were used to improve the scatter correction algorithm, which could be patented by Empa.
The results showed that the cupping effect in the corrected image is strongly reduced. The developed CT simulation is a powerful tool to optimize the design of the CT system and to evaluate the contribution of the scattered radiation to the image. In addition, it has provided the basis for a new scatter correction approach with which it has been possible to achieve images with the same spatial resolution as state-of-the-art well-collimated fan-beam CT, with a factor-of-10 gain in reconstruction time. This result has a high economic impact in non-destructive testing and evaluation, and in reverse engineering.
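Empa's correction algorithm itself is not disclosed in this abstract, but the general principle behind projection-domain scatter correction can be sketched: subtract an estimated scatter signal from the detector reading before Beer-Lambert log-normalisation. All numbers below are made up; without the subtraction, scatter inflates the signal and the attenuation is underestimated, which is what produces the cupping artifact in reconstructions.

```python
import math

def attenuation(measured, flatfield, scatter_estimate):
    """Log-normalise a detector reading after removing estimated scatter,
    yielding the Beer-Lambert line integral of the attenuation coefficient."""
    primary = measured - scatter_estimate   # keep only the primary radiation
    return -math.log(primary / flatfield)

I0 = 1000.0        # unattenuated (flat-field) intensity
primary = 100.0    # true primary signal behind the object
scatter = 50.0     # scattered radiation added on top of the primary

uncorrected = attenuation(primary + scatter, I0, 0.0)
corrected = attenuation(primary + scatter, I0, scatter)
print(corrected > uncorrected)  # → True: scatter made the object look less attenuating
```

In a real system the `scatter_estimate` varies per pixel and per projection, which is exactly what the Monte Carlo model supplies.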
Abstract:
Broad consensus has been reached within the Education and Cognitive Psychology research communities on the need to center the learning process on experimentation and the concrete application of knowledge, rather than on a mere transfer of notions. Several advantages arise from this educational approach, ranging from the reinforcement of student learning, to the increased opportunity for students to gain greater insight into the studied topics, up to the possibility for learners to acquire practical skills and long-lasting proficiency. This is especially true in Engineering education, where integrating conceptual knowledge and practical skills assumes strategic importance. In this scenario, learners are called to play a primary role: they are actively involved in the construction of their own knowledge instead of passively receiving it. As a result, traditional, teacher-centered learning environments should be replaced by novel learner-centered solutions. Information and Communication Technologies enable the development of innovative solutions that answer the need for experimentation supports in educational contexts. Virtual Laboratories, Adaptive Web-Based Educational Systems and Computer-Supported Collaborative Learning environments can significantly foster different learner-centered instructional strategies, offering the opportunity to enhance personalization, individualization and cooperation. More specifically, they allow students to explore different kinds of materials, to access and compare several information sources, to face real or realistic problems and to work on authentic, multi-faceted case studies. In addition, they encourage cooperation among peers and provide support through coached and scaffolded activities aimed at fostering reflection and meta-cognitive reasoning.
This dissertation guides readers through this research field, presenting both the theoretical and applied results of research aimed at designing an open, flexible, learner-centered virtual lab for supporting students in learning Information Security.
Abstract:
Mainstream hardware is becoming parallel, heterogeneous, and distributed on every desk, in every home and in every pocket. As a consequence, in recent years software has been taking an epochal turn toward concurrency, distribution and interaction, pushed by the evolution of hardware architectures and the growth of network availability. This calls for introducing further abstraction layers on top of those provided by classical mainstream programming paradigms, to tackle more effectively the new complexities that developers face in everyday programming. A convergence is recognizable in the mainstream toward the adoption of the actor paradigm as a means to unite object-oriented programming and concurrency. Nevertheless, we argue that the actor paradigm can only be considered a good starting point toward a more comprehensive response to such a fundamental and radical change in software development. Accordingly, the main objective of this thesis is to propose Agent-Oriented Programming (AOP) as a high-level, general-purpose programming paradigm and a natural evolution of actors and objects, introducing a further level of human-inspired concepts for programming software systems, meant to simplify the design and programming of concurrent, distributed, reactive/interactive programs. To this end, the dissertation first constructs the required background by studying the state of the art of both actor-oriented and agent-oriented programming, and then focuses on the engineering of integrated programming technologies for developing agent-based systems in their classical application domains: artificial intelligence and distributed artificial intelligence. Then, we shift the perspective from the development of intelligent software systems toward general-purpose software development.
Using the expertise matured during the background-construction phase, we introduce a general-purpose programming language named simpAL, which is rooted in general principles and practices of software development while providing an agent-oriented level of abstraction for the engineering of general-purpose software systems.
Abstract:
Software is available that simulates all basic electrophoretic systems, including moving boundary electrophoresis, zone electrophoresis, ITP, IEF and EKC, and their combinations, under almost exactly the same conditions used in the laboratory. These dynamic models are based on equations derived from transport concepts such as electromigration, diffusion, electroosmosis and imposed hydrodynamic buffer flow, applied to user-specified initial distributions of analytes and electrolytes. They are able to predict the evolution of electrolyte systems together with associated properties such as pH and conductivity profiles, and are as such the most versatile tools for exploring the fundamentals of electrokinetic separations and analyses. In addition to revealing the detailed mechanisms of fundamental phenomena that occur in electrophoretic separations, dynamic simulations are useful for educational purposes. This review includes a list of current high-resolution simulators, information on how a simulation is performed, simulation examples for zone electrophoresis, ITP, IEF and EKC, and a comprehensive discussion of the applications and achievements.