19 results for EVOLUTION PROCESS

at Universidad Politécnica de Madrid


Relevance:

70.00%

Publisher:

Abstract:

This paper addresses the modelling and validation of an evolvable hardware architecture which can be mapped on a 2D systolic structure implemented on commercial reconfigurable FPGAs. The adaptation capabilities of the architecture are exercised to validate its evolvability. The underlying proposal is the use of a library of reconfigurable components characterised by their partial bitstreams, which are used by the Evolutionary Algorithm to find a solution to a given task. Evolution of image noise filters is selected as the proof of concept application. Results show that the computation speed of the resulting evolved circuit is higher than with the Virtual Reconfigurable Circuits approach, and this can be exploited in the evolution process by using dynamic reconfiguration.
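The evolutionary loop the paper relies on can be sketched in software. The following is a minimal sketch, not the authors' hardware implementation: the library of partial bitstreams is replaced by a hypothetical library of 3x3 convolution kernels, a genome selects the chain of components applied by the systolic pipeline, and fitness is the negative mean squared error against a clean reference image.

```python
import numpy as np

# Hypothetical stand-in for the library of reconfigurable components:
# each "partial bitstream" is represented here by a 3x3 kernel.
LIBRARY = [
    np.ones((3, 3)) / 9.0,                                # mean filter
    np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 16.0,   # Gaussian-like
    np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], float),   # identity
]

def apply_pipeline(genome, image):
    """Apply the chain of library components encoded by the genome."""
    out = image.astype(float)
    for gene in genome:
        k = LIBRARY[gene]
        padded = np.pad(out, 1, mode="edge")   # naive 'same' convolution
        out = sum(k[i, j] * padded[i:i + out.shape[0], j:j + out.shape[1]]
                  for i in range(3) for j in range(3))
    return out

def fitness(genome, noisy, clean):
    """Negative MSE: higher is better."""
    return -np.mean((apply_pipeline(genome, noisy) - clean) ** 2)

def evolve(noisy, clean, pop=20, genes=4, gens=50,
           rng=np.random.default_rng(0)):
    """Generational EA: truncation selection plus point mutation."""
    population = rng.integers(0, len(LIBRARY), size=(pop, genes))
    for _ in range(gens):
        scores = np.array([fitness(g, noisy, clean) for g in population])
        parents = population[np.argsort(scores)[-pop // 2:]]
        children = parents[rng.integers(0, len(parents), pop - len(parents))].copy()
        mutate = rng.random(children.shape) < 0.1
        children[mutate] = rng.integers(0, len(LIBRARY), mutate.sum())
        population = np.vstack([parents, children])
    return max(population, key=lambda g: fitness(g, noisy, clean))
```

On hardware, evaluating one genome amounts to partially reconfiguring the array and streaming the image through it, which is why dynamic reconfiguration speed matters to the evolution time.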

Relevance:

60.00%

Publisher:

Abstract:

The critical conditions for hydrogen embrittlement (HE) risk of high-strength galvanized steel (HSGS) wires and tendons exposed to alkaline concrete pore solutions have been evaluated by means of electrochemical and mechanical testing. There is a relationship between the hydrogen embrittlement risk in HSGS and the length of the hydrogen evolution process in alkaline media. The galvanized steel suffers anodic dissolution simultaneously with the hydrogen evolution, which does not stop until the passivation process is completed. HSGS wires exposed to very high-alkaline media have shown HE risk, with loss of mechanical properties, only when long periods of hydrogen evolution take place together with a simultaneous, intensive reduction of the galvanized coating.

Relevance:

40.00%

Publisher:

Abstract:

The main objective of this article is the analysis of teaching techniques, ranging from the use of blackboard and chalk in traditional classes, through slides and overhead projectors in the eighties and presentation software in the nineties, to video, electronic boards and network resources today. All of the above is viewed in light of the different mentalities with which the teacher conditions the student through each new teaching technique, improving soft skills but possibly leading either to encouragement or to disinterest, and even to a lack of consolidation of educational knowledge at the scientific, technological and specific levels. We likewise study the process of adaptation required of teachers, the differences in the processes of information transfer and education towards the student, and even the existence of teachers who are no longer engaged by their work, which has become much simpler thanks to new technologies and the greater ease of preparing classes under the criteria described in the new degree programmes adopted by the European Higher Education Area. Moreover, we also aim to understand the evolution of student profiles, from the eighties to the present, in order to understand certain attitudes, behaviours, accomplishments and acknowledgements acquired over the semesters of the degree programmes. As an Educational Innovation Group, another key question also arises: what will the learning techniques of the future be? How will these evolving matters affect, both positively and negatively, the mentality, attitude, behaviour, learning, achievement of goals and satisfaction levels of all the parties involved in university education? Clearly, this evolution from chalk to the electronic board, with the three-dimensional view of our works and their sequence, greatly facilitates understanding and the later adaptation to the business world, but it does not answer the unknowns regarding knowledge and the full development of achievement indicators for the basic skills of a degree. This is the underlying question which steers the roots of the presented research.

Relevance:

40.00%

Publisher:

Abstract:

Infrared thermography (IRT) is a safe and non-invasive tool used for examining physiological functions based on skin temperature (Tsk) control. The aim of this paper was to establish the probable thermal difference between the beginning and the end of the anterior cruciate ligament (ACL) rehabilitation process after surgery. For this purpose, thermograms of 25 patients surgically operated on the ACL (2 women, 23 men) were taken on the first and last days of a six-week rehabilitation programme, using a FLIR infrared camera according to the protocol established by the International Academy of Clinical Thermology (IACT). The results showed significant temperature increases in the posterior thigh area between the first and the last week of the rehabilitation process, probably due to a compensatory mechanism. Accordingly, we conclude that the temperature of the posterior area of both the injured and the non-injured leg increased from the first to the last day of the rehabilitation process.

Relevance:

40.00%

Publisher:

Abstract:

An important aspect of Process Simulators for photovoltaics is prediction of defect evolution during device fabrication. Over the last twenty years, these tools have accelerated process optimization, and several Process Simulators for iron, a ubiquitous and deleterious impurity in silicon, have been developed. The diversity of these tools can make it difficult to build intuition about the physics governing iron behavior during processing. Thus, in one unified software environment and using self-consistent terminology, we combine and describe three of these Simulators. We vary structural defect distribution and iron precipitation equations to create eight distinct Models, which we then use to simulate different stages of processing. We find that the structural defect distribution influences the final interstitial iron concentration ([Fe-i]) more strongly than the iron precipitation equations. We identify two regimes of iron behavior: (1) diffusivity-limited, in which iron evolution is kinetically limited and bulk [Fe-i] predictions can vary by an order of magnitude or more, and (2) solubility-limited, in which iron evolution is near thermodynamic equilibrium and the Models yield similar results. This rigorous analysis provides new intuition that can inform Process Simulation, material, and process development, and it enables scientists and engineers to choose an appropriate level of Model complexity based on wafer type and quality, processing conditions, and available computation time.
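The two regimes identified here can be illustrated with a toy kinetics model. The following is a minimal sketch, not one of the three Simulators combined in the paper: it relaxes the interstitial iron concentration toward a temperature-dependent solubility with a diffusion-limited time constant in the spirit of Ham's law. All parameter values (Arrhenius prefactors, activation energies, precipitate radius and density) are illustrative assumptions.

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant, eV/K

def fe_solubility(T):
    """Arrhenius-type solid solubility of interstitial Fe in Si (cm^-3).
    Prefactor and activation energy are illustrative assumptions."""
    return 8e22 * np.exp(-2.0 / (K_B * T))

def fe_diffusivity(T):
    """Arrhenius-type diffusivity of interstitial Fe (cm^2/s), illustrative."""
    return 1e-3 * np.exp(-0.67 / (K_B * T))

def simulate_anneal(T_profile, dt, fe0, precipitate_density=1e6):
    """Relax [Fe_i] toward solubility with a diffusion-limited time constant
    tau ~ 1 / (4*pi*D*r*N), in the spirit of Ham's law, with an assumed
    precipitate radius r (cm) and density N (cm^-3)."""
    r = 1e-6
    fe = fe0
    for T in T_profile:
        tau = 1.0 / (4 * np.pi * fe_diffusivity(T) * r * precipitate_density)
        fe += dt * (fe_solubility(T) - fe) / tau  # explicit Euler step
        fe = max(fe, 0.0)
    return fe

# Cooling from 900 C to 500 C over one hour, 1 s steps
T_profile = np.linspace(1173.0, 773.0, 3600)
print(f"final [Fe_i] = {simulate_anneal(T_profile, 1.0, fe0=1e14):.3e} cm^-3")
```

With the fast ramp used here, tau is much longer than the anneal, so [Fe-i] barely moves: the diffusivity-limited regime. A much slower ramp (or higher precipitate density) lets [Fe-i] track the solubility curve instead, reproducing the solubility-limited regime in which model details matter less.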

Relevance:

30.00%

Publisher:

Abstract:

Zinc chelates have been widely used to correct deficiencies in this micronutrient in different soil types and under different moisture conditions. The aging of the metal in soil could cause a change in its availability. Over time the most labile forms of Zn could decrease in activity and extractability and change to more stable forms. Various soil parameters, such as redox conditions, time, soil type and moisture conditions, affect the aging process and modify the solubility of the metal. In general, redox conditions influence pH and also the chemical forms dissolved in the soil solution. Soil pH also affects Zn solubility; at high pH values, most of the Zn is present in forms that are not bioavailable to plants. The objective of this study was to determine the changes in Zn over time in the soil solution of a waterlogged acidic soil to which synthetic and natural chelates were applied.

Relevance:

30.00%

Publisher:

Abstract:

Degradation studies previously carried out by the research team on a number of different mortars have shown that the observed degradation depends neither exclusively on the equilibrium pH of the solution nor on the relative solubility of the aggressive anions. In our tests, no reason was found that could explain why, among anions of the same solubility, those at a lower pH are less aggressive than others. The aim of this paper is to study the behaviour of cement pastes in aggressive environments. As observed in previous research, this behaviour is not easily explained taking into account only the usual parameters (pH, solubility, etc.). Consequently, the paper studies whether the physicochemical characteristics of the solution are more important in certain environments than specific pH values. It seeks a degradation model which, starting from the physicochemical parameters of the solution, allows the different behaviours shown by cements of different composition to be interpreted. To that end, the rates of degradation of the solid phases were computed for each considered environment. Three cements have been studied: CEM I 42.5R/SR, CEM II/A-V 42.5R and CEM IV/B-(P-V) 32.5 N. The pastes have been exposed to five environments: sodium acetate/acetic acid 0.35 M, sodium sulfate solution 0.17 M, a solution representing natural water, saturated calcium hydroxide solution, and the laboratory environment. The attack mechanism was intended to be unidirectional; to achieve this, all sides of the cylinders were sealed except the attacked surface. The cylinders were taken out of the exposure environments after 2, 4, 7, 14, 30, 58 and 90 days. Variations both in the aggressive solutions and in the solid phases at different depths have been characterised. At each age and depth, the calcium, magnesium and iron contents have been analysed. The evolution of the hydrated phases has been studied using thermal analysis, and changes in crystalline compounds using X-ray diffraction. Sodium sulphate and water solutions stabilise an outer pH near 8 in a short time; however, the stability of the most pH-dependent phases is not the same. Although the pH is similar and a gypsum layer may form near the calcium-leaching surface, this stability is greater than in other sulphate solutions. Variations in the stability of the solids formed by inverse diffusion determine the rate of degradation.

Relevance:

30.00%

Publisher:

Abstract:

Turbulent mixing is a very important issue in the study of geophysical phenomena because most fluxes arising in geophysical fluids are turbulent. We study turbulent mixing due to convection using a laboratory experimental model with two miscible fluids of different densities and an initially top-heavy density distribution. Because the fluids that form the initial unstable stratification are miscible, the turbulence produces molecular mixing. The denser fluid penetrates the lighter fluid layer and generates several forced plumes which are gravitationally unstable. As the turbulent plumes develop, the mixing process grows, driven by the lateral interaction between these plumes at the complex fractal surface between the dense and light fluids.

Relevance:

30.00%

Publisher:

Abstract:

Software Product Line Engineering (SPLE) has proved to have significant advantages in family-based software development, but it also implies the upfront design of a product-line architecture (PLA) from which individual product applications can be engineered. The big upfront design associated with PLAs is in conflict with the current need of "being open to change". However, the turbulence of the current business climate makes change inevitable in order to stay competitive, and requires PLAs to be open to change even late in the development. The trend of "being open to change" is manifested in the Agile Software Development (ASD) paradigm, and it is spreading to the domain of SPLE. To reduce the big upfront design of PLAs as currently practised in SPLE, new paradigms are being created, one being Agile Product Line Engineering (APLE). APLE aims to make the development of product lines more flexible and adaptable to changes, as promoted in ASD. To put APLE into practice it is necessary to make mechanisms available that assist and guide the agile construction and evolution of PLAs while complying with the "be open to change" agile principle. This thesis defines a process for the agile construction and evolution of product-line architectures, which we refer to as Agile Product-Line Architecting (APLA). The APLA process provides agile architects with a set of models for describing, documenting and tracing PLAs, as well as an algorithm to analyse change impact. Both the models and the change impact analysis offer the following capabilities: (1) flexibility and adaptability at the time of defining software architectures, enabling change during the incremental and iterative design of PLAs (anticipated or planned changes) and their evolution (unanticipated or unforeseen changes); (2) assistance in checking architectural integrity through change impact analysis in terms of architectural concerns, such as dependencies on earlier design decisions, rationale, constraints and risks; and (3) guidance in the change decision-making process through change impact analysis in terms of architectural components and connections. Therefore, APLA provides the mechanisms required to construct and evolve PLAs that can easily be refined iteration after iteration during the APLE development process. These mechanisms are provided in a modelling framework called FPLA (Flexible Product-Line Architecture). The contributions of this thesis have been validated through a project concerning a metering management system in electrical power networks. This case study took place in an i-smart software factory, in collaboration with the Technical University of Madrid and Indra Software Labs.
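The core idea of change impact analysis over architectural components and connections can be sketched generically. The following is a minimal sketch, not the FPLA implementation or the thesis' actual algorithm: it assumes a PLA fragment represented as a directed graph of components, and propagates the impact of a change by breadth-first traversal; all component names are hypothetical.

```python
from collections import deque

# Hypothetical PLA fragment: component -> components that depend on it.
# In FPLA terms these would be architectural components and connections.
DEPENDENTS = {
    "MeterReader":         ["DataCollector"],
    "DataCollector":       ["BillingService", "AlarmService"],
    "BillingService":      [],
    "AlarmService":        ["NotificationGateway"],
    "NotificationGateway": [],
}

def change_impact(changed, dependents=DEPENDENTS):
    """Return every component transitively affected by changing `changed`,
    found by breadth-first traversal of the dependency graph."""
    impacted, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for dep in dependents.get(node, []):
            if dep not in impacted:
                impacted.add(dep)
                queue.append(dep)
    return impacted

# Changing the meter reader ripples through collection, billing and alarms:
# the result contains DataCollector, BillingService, AlarmService and
# NotificationGateway.
print(change_impact("MeterReader"))
```

A real PLA model would additionally attach rationale, constraints and risks to nodes and edges, which is the kind of architectural concern the thesis' impact analysis reports on.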

Relevance:

30.00%

Publisher:

Abstract:

Coupled device and process simulation tools, collectively known as technology computer-aided design (TCAD), have been used in the integrated circuit industry for over 30 years. These tools allow researchers to quickly converge on optimized device designs and manufacturing processes with minimal experimental expenditure. The PV industry has been slower to adopt these tools, but is quickly developing competency in using them. This paper introduces a predictive defect engineering paradigm and simulation tool, while demonstrating its effectiveness at increasing the performance and throughput of current industrial processes. The impurity-to-efficiency (I2E) simulator is a coupled process and device simulation tool that links wafer material purity, processing parameters and cell design to device performance. The tool has been validated with experimental data and used successfully with partners in industry. The simulator has also been deployed as a free web-accessible applet, which is available for use by the industrial and academic communities.

Relevance:

30.00%

Publisher:

Abstract:

After more than 40 years of life, software evolution should be considered a mature field. However, despite such a long history, many research questions still remain open, and controversial studies about the validity of the laws of software evolution are common. During the first part of these 40 years, the laws themselves evolved to adapt to changes in both the research and the software industry environments. This process of adaptation to new paradigms, standards and practices stopped about 15 years ago, when the laws were revised for the last time. However, most of the controversial studies have appeared during this latter period. Based on a systematic and comprehensive literature review, in this paper we describe how and when the laws, and the software evolution field, evolved. We also address the current state of affairs regarding the validity of the laws, how they are perceived by the research community, and the developments and challenges that are likely to occur in the coming years.

Relevance:

30.00%

Publisher:

Abstract:

Some free, open-source software projects have been around for quite a long time, the longest-living ones dating from the early 1980s. For some of them, detailed information about their evolution is available in source code management systems that track all their code changes over periods of more than 15 years. This paper examines in detail the evolution of one such project, glibc, with the main aim of understanding how it evolved and how well it matched Lehman's laws of software evolution. As a result, we have developed a methodology for studying the evolution of such long-lived projects based on the information in their source code management repository, described in detail several aspects of the history of glibc, including some activity and size metrics, and found how some of the laws of software evolution may not hold in this case.
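The kind of repository mining such a methodology relies on can be approximated with standard git tooling. The following is a minimal sketch under the assumption that a local clone of the project is available; it counts commits per calendar year as a simple activity metric (the paper's actual metrics and tooling are not specified here).

```python
import subprocess
from collections import Counter

def commits_per_year(repo_path):
    """Count commits per calendar year from `git log` author dates."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log",
         "--pretty=format:%ad", "--date=format:%Y"],
        capture_output=True, text=True, check=True,
    ).stdout
    return Counter(out.splitlines())

# Example with a local clone of glibc (the path is an assumption):
# for year, n in sorted(commits_per_year("glibc").items()):
#     print(year, n)
```

Size metrics (e.g. lines of code per release) can be collected the same way by checking out tagged revisions and measuring the working tree at each one.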

Relevance:

30.00%

Publisher:

Abstract:

The Partido Stream is a small torrential course that flows into the marsh of the Doñana National Park, an area that was declared a World Heritage Site in 1994. Before 1981, floods occurred and the stream overflowed onto a floodplain. As an old alluvial fan, the floodplain has a singular orography and functionality. From the floodplain, several drainage channels, locally called caños, discharged into the marsh. The Partido Stream had the morphology of a caño and covered approximately 8 km from the old fan to the marsh. The stream was straightened and channelised in 1981 in order to cultivate the old fan. This resulted in floods that were concentrated between the banks in the following years, which caused the depth of water and the shear stress to increase, thus scouring the river bed and banks. In this case, the eroded materials were carried towards the marsh, where a new alluvial fan evolved. Control measures on the old fan were implemented in 2006 to stop the development of the new alluvial fan downstream over the marsh. Thus, the stream would partially recover the original behaviour it had before channelisation, moving towards a new, balanced state. The present study describes the geomorphological evolution that channelisation has caused since 1981 and the later slow process of recovery of the original hydraulic-sedimentation regime since 2006. Additionally, it deepens the understanding of the original hydraulic behaviour of the stream, combining field data and 2D simulations.

Relevance:

30.00%

Publisher:

Abstract:

For an adequate assessment of the safety margins of nuclear facilities, e.g. nuclear power plants, it is necessary to consider all possible uncertainties that affect their design, performance and possible accidents. Nuclear data are a source of uncertainty involved in neutronics, fuel depletion and activation calculations. These calculations can predict critical response functions during operation and in the event of an accident, such as the decay heat and the neutron multiplication factor. Thus, the impact of nuclear data uncertainties on these response functions needs to be addressed for a proper evaluation of the safety margins. Methodologies for performing uncertainty propagation calculations need to be implemented in order to analyse the impact of nuclear data uncertainties. Nevertheless, it is necessary to understand the current status of nuclear data and their uncertainties in order to be able to handle this type of data. Great efforts are underway to enhance the European capability to analyse, process and produce covariance data, especially for isotopes which are of importance for advanced reactors. At the same time, new methodologies and codes are being developed and implemented for using and evaluating the impact of uncertainty data. These were the objectives of the European ANDES (Accurate Nuclear Data for nuclear Energy Sustainability) project, which provided the framework for the development of this PhD thesis. Accordingly, first a review of the state of the art of nuclear data and their uncertainties is conducted, focusing on three kinds of data: decay data, fission yields and cross sections. A review of the current methodologies for propagating nuclear data uncertainties is also performed.
The Nuclear Engineering Department of UPM has proposed a methodology for propagating uncertainties in depletion calculations, the Hybrid Method, which has been taken as the starting point of this thesis. This methodology has been implemented, developed and extended, and its advantages, drawbacks and limitations have been analysed. It is used in conjunction with the ACAB depletion code, and is based on Monte Carlo sampling of variables with uncertainties. Different approaches are presented depending on the cross-section energy structure: one-group, one-group with correlated sampling, and multi-group. Differences and applicability criteria are presented. Sequences have been developed for using different nuclear data libraries in different storage formats: ENDF-6 (for evaluated libraries) and COVERX (for the multi-group libraries of SCALE), as well as the EAF format (for activation libraries). A revision of the state of the art of fission yield data shows inconsistencies in the uncertainty data, specifically with regard to complete covariance matrices. Furthermore, the international community has expressed a renewed interest in the issue through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) and its Subgroup 37 (SG37), which is dedicated to assessing the need for complete nuclear data. This has given rise to the review presented here of the state of the art of methodologies for generating covariance data for fission yields. The Bayesian/generalised least squares (GLS) updating sequence has been selected and implemented to answer this need. Once the Hybrid Method has been implemented, developed and extended, along with the fission yield covariance generation capability, different applications are studied. The fission pulse decay heat problem is tackled first, because of its importance during events after shutdown and because it is a clean exercise for showing the impact and importance of decay and fission yield data uncertainties in conjunction with the new covariance data. Two fuel cycles of advanced reactors are then studied: the European Facility for Industrial Transmutation (EFIT) and the European Sodium Fast Reactor (ESFR), and response function uncertainties such as isotopic composition, decay heat and radiotoxicity are addressed. Different nuclear data libraries are used and compared. These applications serve as frameworks for comparing the different approaches of the Hybrid Method, and also for comparing it with other methodologies: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer. These comparisons reveal the advantages, limitations and range of application of the Hybrid Method.
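The Monte Carlo sampling at the core of the Hybrid Method can be illustrated with a toy depletion problem. The following is a minimal sketch, not the ACAB/Hybrid Method implementation: it samples a one-group capture cross section and a decay constant from assumed normal distributions, solves a two-nuclide depletion system with a matrix exponential, and reports the spread of the final parent concentration. All numerical values (flux, cross section, decay constant and their uncertainties) are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(42)

PHI = 1e14       # neutron flux, n/cm^2/s (assumed)
T = 3.15e7       # irradiation time: one year, in seconds
N_SAMPLES = 1000

# Assumed one-group data: (mean, relative standard deviation)
SIGMA_CAPTURE = (2.0e-24, 0.05)   # capture cross section, cm^2, 5% unc.
LAMBDA_DECAY  = (1.0e-9, 0.10)    # daughter decay constant, 1/s, 10% unc.

final_parent = []
for _ in range(N_SAMPLES):
    # Sample the nuclear data (Monte Carlo propagation step)
    sig = rng.normal(SIGMA_CAPTURE[0], SIGMA_CAPTURE[0] * SIGMA_CAPTURE[1])
    lam = rng.normal(LAMBDA_DECAY[0], LAMBDA_DECAY[0] * LAMBDA_DECAY[1])
    # Depletion matrix: parent -> daughter via capture; daughter decays
    A = np.array([[-sig * PHI, 0.0],
                  [ sig * PHI, -lam]])
    n0 = np.array([1.0e20, 0.0])       # initial concentrations, 1/cm^3
    n_final = expm(A * T) @ n0         # matrix-exponential solution
    final_parent.append(n_final[0])

mean, std = np.mean(final_parent), np.std(final_parent)
print(f"parent after 1 y: {mean:.3e} +/- {std:.3e} (1 sigma)")
```

In the one-group approach of the Hybrid Method the same idea is applied to the full nuclide chain, and the correlated-sampling and multi-group variants differ mainly in how the sampled cross sections are drawn and collapsed.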

Relevance:

30.00%

Publisher:

Abstract:

The School of Industrial Engineering at Universidad Politécnica de Madrid (ETSII-UPM) has been promoting student-centred teaching-learning activities, in line with the aims of the Bologna Declaration, since well before the official establishment of the European Higher Education Area. Such student-centred teaching-learning experiences led us to the conviction that project-based learning is rewarding, both for students and academics, and should be further promoted in our new engineering programmes, adapted to the Grade-Master structure. The level of commitment of our teachers to these activities is noteworthy, as the teaching innovation experiences carried out over the last ten years have led to the foundation of 17 Teaching Innovation Groups at ETSII-UPM, which leads the ranking of teaching innovation among all UPM centres. Among interesting CDIO activities, our students have taken part in especially complex projects, including Formula Student, linked to the complete development of a competition car, and the Cybertech competition, aimed at the design, construction and operation of robots for different purposes. Additional project-based learning teamwork activities have been linked to toy design, the development of medical devices, the implementation of virtual laboratories, and the design of complete industrial installations and factories, among other activities detailed in the present study. The implementation of the Bologna process will culminate at ETSII-UPM with the start of the Master's Degree in Industrial Engineering in the academic year 2014-15. The programme has been successfully approved by the Spanish Agency for Accreditation (ANECA) and includes a set of subjects based upon the CDIO methodology, denominated "INGENIA" after the Spanish "ingeniar" (to devise ingenious solutions), which is also related etymologically to "ingeniero" (engineer). INGENIA students will live through the complete development process of a complex product or system, and there will be different kinds of projects covering most of the engineering majors at ETSII-UPM.