921 results for pacs: C6170K knowledge engineering techniques
Abstract:
The core concepts, or threads, of Biosystems Engineering (BSEN) are variously understood by those within the discipline, but have never been unequivocally defined because the field is still at an early stage of development. This makes communication and teaching difficult compared with other well-established engineering subjects. Biosystems Engineering is a field of engineering which integrates engineering science and design with applied biological, environmental and agricultural sciences. It represents an evolution of the Agricultural Engineering discipline, applied to all living organisms but excluding biomedical applications. The basic requirement for the emerging EU Biosystems Engineering program of studies is to ensure that it offers the essential minimum of fundamental engineering knowledge and competences. A core curriculum developed by Erasmus Thematic Networks is used as the benchmark for Agricultural and Biosystems Engineering studies in Europe. The common basis of the core curriculum for the discipline across the Atlantic, including a minimum set of competences comprising the Biosystems Engineering core competencies, has been defined by an Atlantis project, but this needs to be taken further by defining the threads linking courses together. This paper presents a structured approach to defining the threads of BSEN. The definition of the mid-level competences and the associated learning outcomes has been one of the objectives of the Atlantis programme TABE.NET. The mid-level competences and learning outcomes for each of six specializations of BSEN are defined, and the domain-specific knowledge to be acquired for each outcome is proposed. Once the proposed definitions are adopted, these threads will be available for the global development of BSEN.
Abstract:
There is no empirical evidence whatsoever to support most of the beliefs on which software construction is based. We do not yet know the adequacy, limits, qualities, costs and risks of the technologies used to develop software. Experimentation helps to check and convert beliefs and opinions into facts. This research is concerned with the replication area. Replication is a key component for gathering empirical evidence on software development that can be used in industry to build better software more efficiently. Replication has not been an easy thing to do in software engineering (SE) because the experimental paradigm applied to software development is still immature. Nowadays, a replication is executed mostly using a traditional replication package. But traditional replication packages do not appear, for some reason, to have been as effective as expected for transferring information among researchers in SE experimentation. The trouble spot appears to be the replication setup, caused by version management problems with materials, instruments, documents, etc. This has proved to be an obstacle to obtaining enough details about the experiment to be able to reproduce it as exactly as possible. We address the problem of information exchange among experimenters by developing a schema to characterize replications. We will adapt configuration management and product line ideas to support the experimentation process. This will enable researchers to make systematic decisions based on explicit knowledge rather than assumptions about replications. This research will output a replication support web environment. This environment will not only archive but also manage experimental materials flexibly enough to allow both similar and differentiated replications with massive experimental data storage. The platform should be accessible to several research groups working together on the same families of experiments.
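As a rough illustration of what such a characterization schema might look like, the following Python sketch models a replication record under configuration management; the class and field names are hypothetical, not the schema actually proposed in this research.

```python
# A minimal sketch (not the authors' actual schema) of how a replication
# of a software engineering experiment might be characterized so that
# materials and setup details can be versioned and compared across sites.
from dataclasses import dataclass, field

@dataclass
class Artifact:
    name: str     # e.g. "training slides", "task description"
    version: str  # version identifier under configuration management

@dataclass
class ReplicationRecord:
    experiment_family: str  # family of experiments it belongs to
    site: str               # research group / location
    changes_from_baseline: list[str] = field(default_factory=list)
    materials: list[Artifact] = field(default_factory=list)

    def is_literal(self) -> bool:
        # A literal replication introduces no deliberate changes.
        return not self.changes_from_baseline

# Example: a differentiated replication that localized its materials.
rep = ReplicationRecord(
    experiment_family="unit-testing-techniques",
    site="UPM",
    changes_from_baseline=["materials translated to Spanish"],
    materials=[Artifact("task description", "v2.1")],
)
print(rep.is_literal())  # False
```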
Abstract:
Autonomous systems are systems capable of operating in a real-world environment without any form of external control for extended periods of time. Autonomy is a desired goal for every system, as it improves performance, safety and profit. Ontologies are a way to conceptualize the knowledge of a specific domain. In this paper, an ontology for the description of autonomous systems, as well as for their development (engineering), is presented and applied to a process. This ontology is intended to be applied and used to generate final applications following a model-driven methodology.
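As an illustration of how such a conceptualization can be expressed, the following Python sketch uses rdflib to build a small RDF fragment for describing autonomous systems; the namespace, classes and properties are hypothetical and are not taken from the paper's ontology.

```python
# A minimal sketch, using rdflib, of expressing autonomous-system concepts
# as RDF triples. Namespace and vocabulary are illustrative assumptions.
from rdflib import Graph, Namespace, RDF, RDFS, Literal

ASYS = Namespace("http://example.org/asys#")  # hypothetical namespace
g = Graph()
g.bind("asys", ASYS)

# Classes: an autonomous system is a kind of system.
g.add((ASYS.System, RDF.type, RDFS.Class))
g.add((ASYS.AutonomousSystem, RDF.type, RDFS.Class))
g.add((ASYS.AutonomousSystem, RDFS.subClassOf, ASYS.System))
g.add((ASYS.Sensor, RDF.type, RDFS.Class))

# A property linking an autonomous system to the process it controls.
g.add((ASYS.controlsProcess, RDF.type, RDF.Property))
g.add((ASYS.controlsProcess, RDFS.domain, ASYS.AutonomousSystem))

# An individual: a concrete system instance applied to a process.
g.add((ASYS.rover1, RDF.type, ASYS.AutonomousSystem))
g.add((ASYS.rover1, RDFS.label, Literal("example autonomous rover")))

print(g.serialize(format="turtle"))
```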
Abstract:
The main objective of this article is the analysis of teaching techniques, ranging from the use of blackboard and chalk in the old traditional classes, through slides and overhead projectors in the eighties and presentation software in the nineties, to video, electronic boards and network resources nowadays. Furthermore, all of the aforementioned is viewed in the light of the different mentalities with which the teacher conditions the student through each new teaching technique: improving soft skills, but perhaps leading either to encouragement or to disinterest, and including a lack of consolidation of educational knowledge at the scientific, technological and specific levels. In the same way, we study the process of adaptation required of teachers, the differences in the processes of information transfer and education towards the student, and even the existence of teachers who are no longer attracted by their work, which has become much simpler thanks to new technologies and to the greater ease in the development of classes under the criteria described in the new degree programs adopted by the European Higher Education Area. Moreover, we also seek to understand the evolution of students' profiles from the eighties to the present time, in order to understand certain attitudes, behaviours, accomplishments and acknowledgements acquired over the semesters of the degree programs. As an Educational Innovation Group, other key questions also arise: what will the learning techniques of the future be? How will these evolving matters affect, both positively and negatively, the mentality, attitude, behaviour, learning, achievement of goals and satisfaction levels of all those involved in university education? Clearly, this evolution from chalk to the electronic board, with the three-dimensional view of our work and its sequencing, greatly facilitates understanding and the later adaptation to the business world, but it does not resolve the unknowns regarding knowledge and the full development of achievement indicators for the basic skills of a degree. This is the underlying question at the root of the research presented here.
Abstract:
The engineer must have sufficient theoretical knowledge to apply to solving specific problems, together with the capacity to simplify these approaches while taking into account factors such as speed, simplicity, quality and economy. In Geology, the ultimate goal is the exploration of the history of geological events through observation, deduction and reasoning and, in exceptional cases, through direct underground exploration or experimentation. Experimentation is very limited in Geology: laboratory reproduction of certain geological phenomena or processes is difficult because both time and space operate at very large scales. For this reason, some Earth Sciences remain at a nearly descriptive stage, whereas others closer to experimentation, such as Geophysics and Geochemistry, have assimilated the progress made in physics and chemistry. Thus, Anglo-Saxon countries clearly separate Engineering Geology from Geological Engineering, i.e. Geology applied to Geological Engineering concepts. Although there is a large professional overlap, the former corresponds to a scientific approach, while the latter corresponds to a technological one. Geology applied to engineering could be defined as the science applied to the design, construction and performance of engineering infrastructures, and as a field geology discipline. There has been much discussion on the primacy of theory over practice. Today an exaggerated emphasis on practice prevails, but this produces good routine workers and mediocre teachers; such a view forgets that teaching is a problem of finding the right balance. The approach of the lines of action in the European Higher Education Area (EHEA) framework provides for such a balance. The Applied Geology subject represents the students' first real contact with the physical environment and with professional practice and works. Besides, the position of the subject early in the study plans links it, for many students, to other subjects and topics of the degree (tunnels, dams, groundwater, roads, etc.). This work analyses in depth the justification of such practical field trips; it presents the criteria and methods of planning and the results as manifested in the students. Once the field-trip experience had been developed, this work set out to establish its results and the changes in students' motivation from a learning perspective, independently of students' properly assessed knowledge achievements, which are not the subject of this work. For this purpose, a survey on student motivation before and after the trip was designed by the Unidad Docente de Geología Aplicada of the Departamento de Ingeniería y Morfología del Terreno (Escuela Técnica Superior de Ingenieros de Caminos, Canales y Puertos, Universidad Politécnica de Madrid). It was completely anonymous, and its objective was to collect the opinion of the student as a key agent in the learning and teaching of the subject. All the work takes place under the new teaching/learning criteria of the European framework for Higher Education. The results are exceptionally good, with 90% student participation and very high scores on a number of questions such as the itineraries, teachers and visited places (4.2 to 4.5 on a 5-point scale). The majority of students are very satisfied (average of 4.5 on a 5-point scale).
Abstract:
A Survey Engineering curriculum involves the integration of many formal disciplines at a high level of proficiency. The Escuela de Ingenieros en Topografía, Cartografía y Geodesia at Universidad Politécnica de Madrid (Survey Engineering) has developed intense and deep teaching in the so-called Applied Land Sciences and Technologies, or Land Engineering. However, new approaches are encouraged by the European Higher Education Area (EHEA), which requires a review of traditional teaching and methods. Furthermore, globalization and a more international outlook give this discipline new ways to teach and learn how to bridge the gap between cultures and regions. This work is based on two main needs. On the one hand, it rests on the integration of the basic knowledge and disciplines involved in typical Survey Engineering within Land Management. On the other, there is an urgent need to consider territory on a social and ethical basis, as part of the society, culture, idiosyncrasy and economy. The integration of the relevant knowledge within Land Management is typically dominated by civil engineers and urban planners, yet it is quite possible to integrate Survey Engineering and Cooperation for Development within the framework of the Land Management disciplines. Cooperation for Development is a concept that has changed since it first came into use. Development projects leave an impact on society, respond to their beneficiaries and are directed towards self-sustainability; moreover, they are the true bridge for reducing the gap between societies when differences are immeasurable. The concept of development has also been changing and is no longer a purely economic concept: education, science and technology are taking an increasingly large role in what is meant by development. Moreover, it is commonly accepted that universities should transfer knowledge to society, and that this transfer should be open to the countries most in need of development. If a country's development is driven by education, science and technology, knowledge transfer is one of the clearest forms of Cooperation for Development. Therefore, university cooperation is one of the most powerful tools for achieving it, placing universities as agents of development. In Spain, the role of universities as agents of development and cooperation has been greatly strengthened. This work deals with how to implement both Cooperation for Development and Land Management within Survey Engineering in the EHEA framework.
Abstract:
In order to properly manage ontology development projects in complex settings and to correctly apply the NeOn Methodology, it is crucial to have knowledge of the entire ontology development life cycle before starting a development project. The ontology project plan and schedule help the ontology development team acquire this knowledge and monitor the project execution. To facilitate the planning and scheduling of ontology development projects, a NeOn Toolkit plugin called gOntt has been developed. gOntt is a tool that supports the scheduling of ontology network development projects and helps to execute them. In addition, prescriptive methodological guidelines for scheduling ontology development projects using gOntt are provided.
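As a rough sketch of what scheduling an ontology development project involves, the following Python example orders hypothetical life-cycle activities by their dependencies; it illustrates the idea only and is not gOntt's actual model.

```python
# A minimal scheduling sketch: ontology project activities with
# dependencies, ordered topologically so that each activity is planned
# only after the activities it depends on. Activity names are assumed.
from graphlib import TopologicalSorter

dependencies = {
    "requirements specification": [],
    "reuse of ontological resources": ["requirements specification"],
    "conceptualization": ["requirements specification",
                          "reuse of ontological resources"],
    "implementation": ["conceptualization"],
    "evaluation": ["implementation"],
}

# static_order() yields a valid execution order for the plan.
order = list(TopologicalSorter(dependencies).static_order())
print(order)
```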
Abstract:
During the last century, much research in the business, marketing and technology fields has developed the innovation research line, and a large amount of knowledge can be found in the literature. Currently, the importance of systematic and open approaches to managing the available innovation sources is well established in many knowledge fields. This also holds in the software engineering sector, where organizations need to absorb and exploit as many innovative ideas as possible to succeed in the current competitive environment. This Master's Thesis presents a study of the innovation sources in the software engineering field. The main research goals of this work are the identification and relevance assessment of the available innovation sources and the understanding of trends in innovation source usage. Firstly, a general review of the literature was conducted in order to define the research area and to identify research gaps. Secondly, a Systematic Literature Review (SLR) was adopted as the research method of this work, in order to report reliable conclusions by systematically collecting quality evidence about the innovation sources in the software engineering field. This contribution provides resources, built on the empirical studies included in the SLR, to support a systematic identification and an adequate exploitation of the innovation sources most suitable for software engineering. Several artefacts, such as lists, taxonomies and relevance assessments of the innovation sources most suitable for software engineering, have been built, and their usage trends over the last decades and their particularities in some countries and knowledge fields, especially software engineering, have been researched. This work can help researchers, managers and practitioners of innovative software organizations to systematize critical activities in innovation processes, such as the identification and exploitation of the most suitable opportunities. Innovation researchers can use the results of this work to conduct studies involving the innovation sources research area, whereas organization managers and software practitioners can use the outcomes in a systematic way to improve their innovation capability, consequently increasing the value created by the processes they run to provide products and services useful to their environment. In summary, this Master's Thesis researches the innovation sources in the software engineering field, providing useful resources to support effective innovation source management. Several aspects should be studied in more depth to increase the accuracy of the presented results and to obtain further resources built on empirical knowledge. This can be supported by the INnovation SOurces MAnagement (InSoMa) framework, which is introduced in this work in order to encourage open and systematic approaches to identifying and exploiting the innovation sources in the software engineering field.
Abstract:
Each day, the design and development of vehicle suspension systems relies more on computer-aided design and computer-aided engineering tools, which allow anticipating problems and solving them ahead of time. Dynamic behavior and characteristics are thus simulated accurately and inexpensively with moderate computational times and resources. There is, however, an iterative component in the process, which involves the manual definition of designs in a trial-and-error manner. This Thesis takes a step towards the development of an efficient simulation framework capable of simulating, analyzing and evaluating vehicle suspension designs, and automatically improving them by varying the design parameters towards the optimal solution.
The multibody systems approach is hereby used to model a three-dimensional 18-degrees-of-freedom coach in a comprehensive yet efficient way. The suspension geometry and characteristics resemble the ones from the real vehicle, as do the rest of the vehicle parameters. In order to simulate vehicle dynamics, an efficient, state-of-the-art multibody formulation based on Maggi's equations is employed, and a three-dimensional graphics viewer is developed. As a result, vehicle maneuvers can be simulated faster than real-time. Once the dynamics are ready, a sensitivity analysis is crucial for a robust optimization. To that end, a mathematical technique is introduced, which allows differentiating the dynamic variables within the multibody formulation in a general, algorithmic, accurate to machine precision, and reasonably efficient way: automatic differentiation. This method propagates the derivatives with respect to the design parameters throughout the computer code, with little user interaction. In contrast with other attempts in the literature, mostly not general-purpose, a benchmarking of libraries is carried out, a hybrid direct-automatic differentiation approach for the computation of sensitivities is developed, and several real-life examples are analyzed. Finally, a design optimization process of the aforementioned vehicle is carried out. Four different types of dynamic response optimization are presented: parameter identification, handling optimization, ride comfort optimization and multi-objective optimization; all of which are applied to the design of the coach example. Together with analytical and visual proof of the results, efficiency considerations are made. In summary, the dynamic behavior of vehicles is improved by using the multibody systems approach, along with advanced differentiation and optimization techniques, enabling an automatic, accurate and efficient tuning of design parameters.
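As an illustration of the core idea of automatic differentiation, the following Python sketch implements minimal forward-mode dual numbers that propagate a derivative with respect to a design parameter through ordinary arithmetic; it is a didactic example under assumed numerical values, not the hybrid direct-automatic formulation developed in the Thesis.

```python
# A minimal sketch of forward-mode automatic differentiation with dual
# numbers: each value carries its derivative with respect to one design
# parameter, and arithmetic propagates both together.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

# Hypothetical example: suspension force F(k) = k*x + c*v for a spring
# stiffness k. Seeding k with derivative 1 yields dF/dk alongside F.
x, v, c = 0.02, 0.5, 1200.0   # assumed deflection, velocity, damping
k = Dual(35000.0, 1.0)        # design parameter, seeded with dk/dk = 1
F = k * x + c * v
print(F.value, F.deriv)       # force and its sensitivity dF/dk (= x)
```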
Abstract:
We are living through an age of Internetification. Nowadays, Internet connections are a utility whose presence one can simply assume. The web has become a place of generation of content by users. The information generated surpasses the notion with which the World Wide Web emerged because, in most cases, this content has been designed to be consumed by humans and not by machines. This fact implies a change of mindset in the way that we design systems; these systems should be able to support a computational and storage capacity that apparently grows endlessly. At the same time, our education system is in a state of crisis: the high costs of high-quality education threaten the academic world. With the use of technology, we could achieve an increase of productivity and quality, and a reduction of these costs in this field, which has remained largely unchanged since the Renaissance. In CloudRoom, a MOOC platform has been designed with an architecture that follows the latest conventions in Cloud Computing, which involves the use of REST services and NoSQL databases, and uses the latest recommendations of the W3C in terms of web development and Linked Data. For its construction, agile Software Engineering methods, Human-Computer Interaction techniques, and state-of-the-art technologies such as Neo4j, Redis, Node.js, AngularJS, Bootstrap, HTML5, CSS3 and Amazon Web Services have been used. Furthermore, a comprehensive Informatics Engineering work has been performed, combining virtually all of the fundamental areas of knowledge in Computer Science.
In summary, the pillars of a robust, maintainable, distributed system have been devised: a system with social and semantic capabilities, which runs on multiple devices and scales to millions of users.
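As an illustration of the REST-over-JSON style mentioned above, the following minimal sketch exposes a hypothetical course resource; it is written in Python with Flask for brevity, whereas the actual platform is built on Node.js, and the routes and fields are not CloudRoom's real API.

```python
# A sketch of the REST-over-JSON pattern, in Python/Flask for brevity
# (the platform itself uses Node.js); routes and fields are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)
COURSES = {"1": {"id": "1", "title": "Linked Data 101"}}  # in-memory store

@app.get("/api/courses/<course_id>")
def get_course(course_id):
    course = COURSES.get(course_id)
    if course is None:
        return jsonify(error="not found"), 404
    return jsonify(course)

@app.post("/api/courses")
def create_course():
    body = request.get_json()
    new_id = str(len(COURSES) + 1)
    COURSES[new_id] = {"id": new_id, "title": body["title"]}
    return jsonify(COURSES[new_id]), 201

if __name__ == "__main__":
    app.run()  # a production setup would sit behind a proper server
```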
Learning and Assessing Competencies: New challenges for Mathematics in Engineering Degrees in Spain.
Abstract:
The introduction of the new degrees adapted to the European Higher Education Area (EHEA) has involved a radically different approach to the curriculum. The new programs are structured around the competencies to be acquired. Based on these competencies, teachers must define and develop learning objectives, design teaching methods and establish appropriate evaluation systems. While most Spanish universities have incorporated methodological innovations and evaluation systems different from traditional exams, there is considerable confusion about how to teach and assess competencies and learning outcomes, as teaching and assessment have traditionally focused on knowledge. In this paper, we analyze the state of the art in the mathematics courses of the new engineering degrees at some Spanish universities.
Abstract:
Mealiness is a sensory attribute that cannot be defined by a single parameter but only through a combination of variables (a multidimensional structure). Previous studies define mealiness as the lack of crispiness, hardness and juiciness. Current efforts focus on establishing non-destructive tests for mealiness assessment. Multi-slice multi-echo magnetic resonance images (MRI, 64×64 pixels) were taken at an echo time of 3 ms. Small samples of Top Red apples stored for 6 months under controlled atmosphere (expected to be non-mealy) and at 2°C (expected to be mealy) were used for MRI imaging. Three out of four apples from the sample kept under controlled atmosphere did not develop mealiness, while three out of four fruits from the sample stored at 2°C became mealy after 6 months of storage. The minimum T2 value per image obtained for the mealy apples is significantly lower than for the non-mealy apples, indicating that a more disaggregated structure leads to a quicker loss of signal. Also, there is a significant linear correlation (r = -0.76) between the number of pixels with a T2 value below 35 ms within a fruit image and the deformation parameter registered during the Magness-Taylor firmness test. Finally, all the T2 images of the mealy apples show a regional variation of contrast which is not present for non-mealy apples. This variation of contrast is similar to that in MRI images of water-cored apples, indicating that in these cases there is a differential water movement that may precede internal browning.
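The two quantitative analyses described, counting pixels with T2 below 35 ms per image and correlating that count with the Magness-Taylor deformation parameter, can be sketched as follows in Python; the data here are randomly generated placeholders, not the study's measurements.

```python
# A sketch of the two analyses on simulated data:
# (1) count pixels with T2 below 35 ms in each 64x64 T2 map, and
# (2) correlate that count with the Magness-Taylor deformation parameter.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
# One simulated 64x64 T2 map (in ms) per fruit; real maps come from MRI.
t2_maps = [rng.normal(loc=45.0, scale=10.0, size=(64, 64)) for _ in range(8)]
deformation = rng.normal(loc=2.0, scale=0.3, size=8)  # placeholder values

counts = np.array([(m < 35.0).sum() for m in t2_maps])
r, p = pearsonr(counts, deformation)
print("pixels below 35 ms per fruit:", counts)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```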
Abstract:
In recent decades, neuropsychological theories have tended to consider cognitive functions as the result of the workings of the whole brain, rather than of individual local areas of its cortex. Studies based on neuroimaging techniques have multiplied in recent years, promoting an exponential growth of the body of knowledge about the relations between cognitive functions and brain structures [1]. However, such fast evolution makes it complicated to integrate this knowledge into verifiable theories and, even more, to translate it into cognitive rehabilitation. The aim of this research work is to develop a cognitive process-modeling tool. The purpose of this system is, first, to represent multidimensional data from structural and functional connectivity, neuroimaging, lesion studies and data derived from clinical intervention [2][3]. This will make it possible to identify consolidated knowledge, hypotheses, experimental designs, new data from ongoing studies and emerging results from clinical interventions. Second, we aim to use Artificial Intelligence to assist in decision making, allowing advances towards evidence-based and personalized treatments in cognitive rehabilitation. This work presents the knowledge base design of the knowledge representation tool. It is composed of two different taxonomies (structure and function) and a set of tags linking both taxonomies at different levels of structural and functional organization. The remainder of the abstract is organized as follows: Section 2 presents the web application used for gathering the information necessary for generating the knowledge base, Section 3 describes the knowledge base structure, and finally Section 4 presents the conclusions reached.
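As a rough sketch of the described design, the following Python fragment represents two small taxonomies and a set of tags linking structural and functional nodes; all node names and the evidence entry are illustrative, not the tool's actual content.

```python
# A minimal sketch of the knowledge base design: two taxonomies
# (brain structure and cognitive function) plus tags linking nodes of
# both at a given level of organization. Names are illustrative.
structure = {
    "brain": {"frontal lobe": {"dorsolateral prefrontal cortex": {}}},
}
function = {
    "cognition": {"executive functions": {"working memory": {}}},
}

# Each tag links one structural node with one functional node and records
# the evidence supporting the link (e.g. a lesion or neuroimaging study).
tags = [
    {
        "structure": "dorsolateral prefrontal cortex",
        "function": "working memory",
        "evidence": "neuroimaging study (hypothetical citation)",
    },
]

def links_for(functional_node):
    """Return the structural nodes tagged as related to a function."""
    return [t["structure"] for t in tags if t["function"] == functional_node]

print(links_for("working memory"))
```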
Abstract:
The study of the effectiveness of cognitive rehabilitation processes and the identification of cognitive profiles, in order to define comparable populations, is a controversial area, yet one strongly needed in order to improve therapies. There is limited evidence about cognitive rehabilitation efficacy. Many of the trials conclude that, in spite of an apparently good clinical response, the differences do not reach statistical significance. The common feature in all these trials is the heterogeneity of the populations. In this situation, observational studies on very well controlled cohorts, together with innovative methods of knowledge extraction, could provide methodological insights for the design of more accurate comparative trials. Some studies have examined correlations between neuropsychological tests and patients' capacities [1][2], and also correlations between tests and morphological changes in the brain [3]. The efficacy of the procedures depends on three main factors: the affectation profile, the scheduled tasks and the execution results. The relationship between them makes up cognitive rehabilitation as a discipline, but its structure is not properly defined. In this work we present a clustering method used in Neuro Personal Trainer (NPT) to group patients into cognitive profiles using data mining techniques. The system uses these clusters to personalize treatments, using the patient's assigned cluster to select which tasks are most suitable for his or her concrete needs, by comparing the results obtained in the past by patients with the same profile.
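The abstract names data mining techniques but not a specific algorithm; the following Python sketch uses k-means, as an assumption, to illustrate how patients could be grouped into cognitive profiles and how a new patient would be assigned to one. The features and scores are hypothetical.

```python
# A minimal sketch of grouping patients into cognitive profiles with
# k-means; features (columns) and values are hypothetical placeholders.
import numpy as np
from sklearn.cluster import KMeans

# Rows: patients; columns: normalized neuropsychological test scores
# (e.g. attention, memory, executive function).
scores = np.array([
    [0.90, 0.80, 0.70],
    [0.20, 0.30, 0.40],
    [0.85, 0.75, 0.80],
    [0.15, 0.35, 0.30],
    [0.50, 0.50, 0.60],
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(scores)
print(kmeans.labels_)  # cognitive profile (cluster) assigned to each patient

# A new patient is assigned to the nearest profile; the profile would then
# drive the selection of the most suitable rehabilitation tasks.
print(kmeans.predict([[0.80, 0.70, 0.75]]))
```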
Abstract:
The verification and validation activity plays a fundamental role in improving software quality. Determining which are the most effective techniques for carrying out this activity has been an aspiration of experimental software engineering researchers for years. This paper reports a controlled experiment evaluating the effectiveness of two unit testing techniques: the functional testing technique known as equivalence partitioning (EP) and the control-flow structural testing technique known as branch testing (BT). This experiment is a literal replication of Juristo et al. (2013). Both experiments serve the purpose of determining whether the effectiveness of BT and EP varies depending on whether or not the faults are visible to the technique (InScope or OutScope, respectively). We used the materials, design and procedures of the original experiment, but in order to adapt the experiment to the context we: (1) reduced the number of studied techniques from 3 to 2; (2) assigned subjects to experimental groups by means of stratified randomization to balance the influence of programming experience; (3) localized the experimental materials; and (4) adapted the training duration. We ran the replication at the Escuela Politécnica del Ejército Sede Latacunga (ESPEL) as part of a software verification and validation course. The experimental subjects were 23 master's degree students. EP is more effective than BT at detecting InScope faults, and the session/program and group variables are found to have significant effects. BT is more effective than EP at detecting OutScope faults, and the session/program and group variables have no effect in this case. The results of the replication and the original experiment are similar with respect to the testing techniques. There are some inconsistencies with respect to the group factor, which can be explained by small sample effects. The results for the session/program factor are inconsistent for InScope faults; we believe that these differences are due to a combination of the fatigue effect and a technique × program interaction. Although we were able to reproduce the main effects, the changes to the design of the original experiment make it impossible to identify the causes of the discrepancies for sure. We believe that further replications closely resembling the original experiment should be conducted to improve our understanding of the phenomena under study.
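As an illustration of point (2), stratified randomization, the following Python sketch splits hypothetical subjects into two experimental groups so that each group receives a balanced mix of programming-experience levels; subject identifiers, experience levels and group names are all assumed.

```python
# A minimal sketch of stratified randomization: subjects are grouped into
# strata by programming experience, and each stratum is split at random
# across the experimental groups so that experience stays balanced.
import random

subjects = [("s1", "high"), ("s2", "low"), ("s3", "high"), ("s4", "low"),
            ("s5", "high"), ("s6", "low"), ("s7", "high"), ("s8", "low")]
groups = {"group_A": [], "group_B": []}

random.seed(42)  # fixed seed so the example is reproducible
for level in ["high", "low"]:
    stratum = [sid for sid, exp in subjects if exp == level]
    random.shuffle(stratum)
    half = len(stratum) // 2
    groups["group_A"] += stratum[:half]
    groups["group_B"] += stratum[half:]

print(groups)  # each group contains a balanced mix of experience levels
```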