35 results for research supervision in engineering and IT
at Universidad Politécnica de Madrid
Abstract:
The integration of scientific knowledge about possible climate change impacts on water resources has a direct implication for the way water policies are being implemented and evolving. This is particularly true of various technical steps embedded in the EU Water Framework Directive river basin management planning, such as risk characterisation, monitoring, the design and implementation of action programmes, and the evaluation of the achievement of the "good status" objective (in 2015). The need to incorporate climate change considerations into the implementation of EU water policy is currently being discussed with a wide range of experts and stakeholders at EU level. Research is also ongoing, striving to support policy developments and examining how scientific findings and recommendations can best be taken on board by policy-makers and water managers in the coming years. This paper provides a snapshot of policy discussions about climate change in the context of WFD river basin management planning and of specific advances in related EU-funded research projects. Perspectives for strengthening links between the scientific and policy-making communities in this area are also highlighted.
Abstract:
In this work, a comparison is made between the competence codes in the CDIO curriculum, those defined in the Tuning Project and those of the International Project Management Association (IPMA). The goal is to define the most appropriate competence codes for engineering education in Latin America. The CDIO code is derived from engineering practice and responds to the Accreditation Board for Engineering and Technology (ABET) accreditation standards. The Tuning competences are those defined for Latin America, and the IPMA's are international competences for project management. It is the first time that the competences defined in the ABET accreditation standards for engineering have been compared with the international competences of the IPMA model. The results give evidence, first, of the need to apply holistic models in the definition of an engineering curriculum and, second, of the pertinence of these models in the definition of engineering programmes in Latin America.
Abstract:
At present, almost all map libraries on the Internet are image collections generated by the digitization of early maps. This type of graphics file gives researchers the possibility of accessing and visualizing historical cartographic information, bearing in mind that the quality of this information depends on factors such as the accuracy of the digitization process and proprietary constraints (e.g. visualization, resolution, downloading options, copyright, use constraints). In most cases, access to these map libraries is useful only as a first approach, and the maps cannot be used for scientific work owing to the sparse tools available to measure, match, analyze and/or combine those resources with other kinds of cartography. This paper presents a method to enrich virtual map rooms and provide historians and other professionals with a tool that lets them make the most of map libraries in the digital era.
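Purely as an illustrative sketch (not the method described in the paper), one basic operation such a tool would need, matching a digitized early map to modern cartography through user-picked control points, could be a least-squares affine fit; the point coordinates below are hypothetical.

```python
import numpy as np

def fit_affine(pixel_pts, map_pts):
    """Least-squares affine transform from scanned-map pixel coordinates to
    modern map coordinates, estimated from paired control points
    (illustrative sketch only, not the tool described in the paper)."""
    pixel_pts = np.asarray(pixel_pts, dtype=float)
    map_pts = np.asarray(map_pts, dtype=float)
    # Design matrix [x, y, 1] so that [X, Y] = [x, y, 1] @ A
    design = np.hstack([pixel_pts, np.ones((len(pixel_pts), 1))])
    coeffs, *_ = np.linalg.lstsq(design, map_pts, rcond=None)
    return coeffs  # 3x2 matrix

# Three or more control points identified on both the scanned map and a modern one
pixels = [(120, 340), (980, 310), (510, 1200), (60, 1100)]
modern = [(-3.71, 40.42), (-3.65, 40.42), (-3.68, 40.38), (-3.71, 40.39)]
A = fit_affine(pixels, modern)
print(np.hstack([np.array([[510.0, 1200.0]]), [[1.0]]]) @ A)  # approximate geographic position
```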
Abstract:
Engineering career models were diverse across Europe, and Spain is now adopting the Bologna process for European universities. Separate from the older universities, which are in part technically active, Civil Engineering (Caminos, Canales y Puertos) started in Spain at the end of the 18th century, adopting the French model of upper schools for state civil servants with an entrance examination. After the intense wars around 1800, the Ingenieros de Montes appeared as an upper school to conserve forest regions, and in 1855 the Ingenieros Agrónomos followed to advance the related techniques and practices. Other engineering upper schools appeared, oriented more towards private industry. These upper schools all acquired associated lower schools of Ingeniero Técnico. Both have recently grown considerably in number and evolved, linked also to recognised professions. Spanish society, within the European Community, evolved markedly around the year 2000, in part very well but with severe imbalances that caused high youth unemployment in the 2008-2011 crisis. With the Bologna process, major formal changes were introduced from 2010-11 and accepted with intense adaptation. The lower schools are converging towards the upper schools, and since 2010-11 both have shifted to various four-year degrees (Grado), some linked to the pre-existing professions, and to diverse master's programmes. Their acceptance among students has started relatively well and will evolve, and the acceptance of the new degrees for employment in Spain, Europe or elsewhere will be essential. Each Grado now has a fairly rigid curriculum and programme; MOODLE was introduced to connect students, and specific uses of personal computers are taught in each subject. The Escuela de Agrónomos centre, reorganised under its old name in its original buildings at the entrance of the Moncloa campus, offers Grados in Agronomic Engineering and Science for various public and private agricultural activities, Alimentary Engineering for food activities and control, Agro-Environmental Engineering oriented towards environmental activities, and in part Biotechnology, also taught in laboratories on the Montegancedo campus for Plant Biotechnology and Computational Biotechnology. Curricula include basics, engineering, practicals, visits, English, a final-year project and work placements. Some master's programmes will lead to specific professional diplomas; the list now includes Agro-Engineering, Agro-Forestry Biotechnology, Agro and Natural Resources Economy, Complex Physical Systems, Gardening and Landscaping, Rural Engineering, Phytogenetic Resources, Plant Genetic Resources, Environmental Technology for Sustainable Agriculture, and Technology for Human Development and Cooperation.
Abstract:
This paper aims to outline a theory-based Content and Language Integrated Learning course and to establish the rationale for adopting a holistic approach to the teaching of languages in tertiary education. Our work focuses on the interdependence between Content and Language Integrated Learning (CLIL) and the use of Information and Communication Technologies (ICT), in particular regarding the learning of English within the framework of Telecommunications Engineering. The study first analyses the diverse components of the instructional approach and the extent to which it interrelates with technologies within the context of what we have defined as a holistic experience, since it also aims to develop a set of generic competences or transferable skills. Second, an example of a course project framed in this holistic approach is described in order to illustrate the specific actions suggested for learner autonomy and CLIL. The approach provides both an adequate framework and the conditions needed to carry out a lifelong learning experience within our context, a Spanish School of Engineering. In addition to specialized language and content, the approach integrates the learning of the skills and capacities required by the new study plans established following the 1999 Bologna Declaration.
Abstract:
An online open-access test (the CREAX self-assessment) was used in this work so that students from engineering degrees at the Universidad Politécnica de Madrid (UPM) could self-assess their creative competence after several classroom activities. Different first-year groups were statistically compared using their assessment data. These first-year students had different professors in the subject "Technical Drawing" and belonged to several UPM degrees. They were also compared by sex, and a group of first-year students was compared with a final-year group of the same degree in order to observe possible differences in the achievement of this competence. Only one sex-related difference was detected, in one of the degrees. Among degrees, the higher marks obtained by students who had done specific in-class exercises for the development of creativity stand out. Finally, significantly higher marks were observed in final-year students with respect to first-year students. The CREAX tool has proved very useful for the assessment of this competence in the UPM degrees in which it has been implemented.
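The abstract does not specify which statistical test was applied; as a hedged illustration only, a comparison of CREAX scores between two groups (the score values below are hypothetical) could be run along these lines:

```python
from scipy import stats

# Hypothetical CREAX self-assessment scores for two first-year groups
group_a = [62, 71, 58, 64, 69, 73, 55, 67, 70, 61]   # did specific creativity exercises
group_b = [57, 60, 52, 63, 59, 55, 61, 54, 58, 56]   # did not

# Independent-samples (Welch's) t-test, not assuming equal variances
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 would indicate a difference in means

# Non-parametric alternative if normality is doubtful
u_stat, p_mw = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
```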
Abstract:
The bonding quality of epoxy-glued timber and glass fibre reinforced polymers (GFRP) was evaluated by means of a compression loading shear test. Three timber species (Radiata pine, Laricio pine and Oak) and two kinds of GFRP (plates and rods made of polyester resin reinforced with mat and roving glass fibre) were glued and tested using three epoxy formulations. The increase in shear strength with age after the setting of the epoxy formulations and the effect of surface roughness on the gluing of timber and GFRP (planing of the timber surface and prior sanding of the GFRP) were studied. It can be concluded that the mechanical properties of these products make them suitable for use in the reinforcement of deteriorated timber structures, and that a rough timber surface is preferable to a planed one, while prior sanding of the GFRP surfaces is not advantageous.
Abstract:
This article describes a research project involving students from nine different engineering degrees at the Technical University of Madrid. The purpose of the project was to analyze the use of peer and self-assessment and the students' attitudes toward alternative assessment procedures.
Abstract:
The increase in multimedia services delivered over packet-based networks has brought greater quality expectations from end-users. This has led to intensive research on techniques for evaluating the quality of experience (QoE) perceived by viewers of audiovisual content, considering the different degradations it may suffer along the broadcasting chain. In this paper, a comprehensive study of the impact of transmission errors affecting video and audio in IPTV is presented. To this end, subjective assessment tests were carried out using a novel methodology that tries to reproduce home viewing conditions as closely as possible. 3DTV content in side-by-side format was also used in the experiments to compare the impact of the degradations. The results provide a better understanding of the effects of transmission errors and show that the QoE of this first approach to 3DTV is acceptable, although the visual discomfort it causes should be reduced.
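As a hedged illustration of how such subjective scores are typically summarised (the paper's exact processing is not detailed in the abstract), the mean opinion score (MOS) and its 95% confidence interval for one test condition could be computed as follows, using hypothetical 5-point ratings:

```python
import numpy as np
from scipy import stats

def mos_with_ci(scores, confidence=0.95):
    """Mean opinion score and t-based confidence interval for one test condition."""
    scores = np.asarray(scores, dtype=float)
    mos = scores.mean()
    sem = stats.sem(scores)                                   # standard error of the mean
    half_width = sem * stats.t.ppf((1 + confidence) / 2, len(scores) - 1)
    return mos, (mos - half_width, mos + half_width)

# Hypothetical ratings (1 = bad ... 5 = excellent) for a clip affected by packet loss
print(mos_with_ci([4, 3, 4, 5, 3, 4, 2, 4, 3, 4]))
```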
Abstract:
The objective of this paper is to describe the methodological process of a teaching strategy for training project management complexity in postgraduate programmes. The proposal combines different methods (intuitive, comparative, deductive, case study, problem-solving and Project-Based Learning) and different activities inside and outside the classroom. This integration of methods motivated the use of the concept of "learning strategy". The strategy has two phases: first, the integration of the competences (technical, behavioural and contextual) in real projects; and second, a learning activity oriented towards a higher level of knowledge, namely evaluating the complexity of project management in real situations. Both the competences in the learning strategy and the Project Complexity Evaluation are based on the ICB of the IPMA. The learning strategy is applied in an international postgraduate programme, an Erasmus Mundus Master of Science, with the participation of five universities of the European Union. This master's programme is the fruit of a cooperative experience involving an Educational Innovation Group of the UPM (GIE-Project), two UPM research groups and collaboration with agents external to the university. Some reflections on the experience and the main success factors of the learning strategy are presented in the paper.
Abstract:
The competence evaluation promoted by the European Higher Education Area entails a very important methodological change that requires guiding support to help teachers carry out this new and complex task. In this regard, the Technical University of Madrid (UPM, by its Spanish acronym) has financed a series of coordinated projects with a two-fold objective: a) to develop a model for teaching and evaluating core competences that is useful and easily applicable to its different degrees, and b) to provide support to teachers by creating an area within the Website for Educational Innovation where they can search for information on the model corresponding to each core competence approved by UPM. The information available on each competence includes its definition, the formulation of indicators providing evidence of the level of acquisition, the recommended teaching and evaluation methodology, examples of evaluation rules for the different levels of competence acquisition, and descriptions of best practices. These best practices correspond to pilot tests applied to several academic subjects at UPM in order to validate the model. This work describes the general procedure that was used and presents the model developed specifically for the problem-solving competence. Some of the pilot experiences are also summarised and their results analysed.
Abstract:
The authors, all from UPM, have been involved at different times, and at different ages, in various academic and real cases on this subject. Building on the precedent of E. Torroja and A. Páez in Madrid, Spain, on probabilistic safety models for concrete around 1957 (a topic now discussed at ICOSSAR conferences), author J. M. Antón, involved since autumn 1967 in European steel construction within the CECM, produced a mathematical model for the reduction of superposed independent loads and, using it, a load-coefficient pattern for codes (Rome, Feb. 1969) that was practically adopted for European construction; at the JCSS meeting in Lisbon (Feb. 1974) he suggested its unification for concrete, steel and aluminium. The model represents loads with Gumbel Type I distributions: the 50-year distribution for one type of load is reduced to 1 year so that it can be added to other independent loads, and the sum is set, following Gumbel theory, to a 50-year return period; parallel models exist. A complete reliability system was produced, including nonlinear effects such as buckling, phenomena considered to some extent in the current structural Eurocodes derived from the Model Codes. The author also considered the system in the CEB in the presence of hydraulic effects from rivers, floods and the sea, with reference to actual practice. When drafting a road drainage standard for MOPU in Spain, the authors developed an optimisation model to determine the return period, 10 to 50 years, for the hydraulic flows to be considered in road drainage. Satisfactory examples were a stream in the south-east of Spain modelled with a Gumbel Type I law and a paper by Ven Te Chow on the Mississippi at Keokuk using a Gumbel Type II law; the approach can be modernised with a wider range of extreme-value laws. In the MOPU drainage standard the drafting commission also acted as an expert panel to set a table of return periods for road-drainage elements, effectively a complex multi-criteria decision system. These ideas were used, for example, in widely applied codes and presented at symposia and meetings, but not published in English-language journals; a condensed account of the authors' contributions is presented here. The authors are also involved in optimisation for hydraulic and agricultural planning, and they give modest hints of intended applications to agricultural and environmental planning, as a selection of the criteria and utility functions involved in Bayesian, multi-criteria or mixed decision systems. Modest consideration is given to changes in climate, in production and commercial systems, and in other social and financial factors.
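As a brief formal sketch of the kind of model referred to (these are the standard Gumbel Type I relations, assumed here for illustration and not reproduced from the paper itself): for annual maxima following a Gumbel Type I law,

```latex
% Gumbel Type I CDF and the design value for return period T
F(x) = \exp\!\bigl[-\exp\bigl(-(x-u)/\beta\bigr)\bigr], \qquad
x_T = u - \beta\,\ln\bigl[-\ln(1 - 1/T)\bigr]
%
% The maximum over n independent years is again Gumbel with the same scale:
F_{n\,\mathrm{years}}(x) = \bigl[F_{1\,\mathrm{year}}(x)\bigr]^{\,n}
\;\Longleftrightarrow\;
u_n = u_1 + \beta \ln n
```

Thus a 50-year load distribution differs from the 1-year one only by a shift of the mode by beta ln 50, which is what allows each load to be reduced to 1 year, added to the other independent loads, and the sum returned to a 50-year return period.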
Abstract:
The Bioinstrumentation Laboratory belongs to the Centre for Biomedical Technology (CTB) of the Technical University of Madrid, and its main objective is to provide the scientific community with devices and techniques for the characterisation of micro- and nanostructures and, consequently, to find their best biomedical applications. Hyperthermia (from the Greek for "overheating") is defined as the phenomenon that occurs when a body is exposed to an energy-generating source that can produce a rise in temperature (42-45 °C) for a given time [1]. Specifically, the aim of the hyperthermia methods used in the Bioinstrumentation Laboratory is the development of thermal therapies, some of them using different kinds of nanoparticles, to kill cancer cells while reducing the damage to healthy tissues. Optical hyperthermia is based on noble-metal nanoparticles and laser irradiation. These nanoparticles have immense potential for the development of cancer therapies on account of their Surface Plasmon Resonance (SPR) enhanced light scattering and absorption. In a short period of time the absorbed light is converted into localised heat, so these characteristics can be exploited to heat up tumour cells and induce cell death [2]. In this case, the laboratory has an optical hyperthermia device based on a continuous-wave laser, used to kill glioblastoma cell lines (1321N1) in the presence of gold nanorods (Figure 1a). The wavelength of the laser light is 808 nm, because light penetrates deeper into tissue in the near-infrared region. The first optical hyperthermia results show that laser irradiation produces cell death in the experimental samples of glioblastoma cell lines containing gold nanorods, but is not able to decrease the viability of cancer cells in samples without the suitable nanorods (Figure 1b) [3]. Magnetic hyperthermia is generated through changes of the magnetic induction in magnetic nanoparticles (MNPs) embedded in a viscous medium. Figure 2 shows a schematic design of the AC induction hyperthermia device for magnetic fluids, manufactured at the Bioinstrumentation Laboratory. The first block comprises two stages: signal selection, with frequency adjustment from 9 kHz to 2 MHz, and a linear output of up to 1500 W. The second block is where the magnetic field is generated (coil of 5 mm, 10 turns). Finally, the third block is the control software, in which the user sets the initial parameters and which displays the temperature response of the MNPs to the applied magnetic field [4-8]. The Bioinstrumentation Laboratory, in collaboration with the Mexican company MRI-DT, has recently opened a new research line on nuclear magnetic resonance hyperthermia, based on patent US 7,423,429 B2 owned by this company. This investigation relies on the use of clinical MRI equipment not only for diagnosis but also for therapy [9]. The idea rests on two main facts: magnetic resonance imaging can cause focal heating [10], and healthy and cancer cells differ in resonant frequency [11]. To heat only the cancer cells when the whole body is irradiated, it is necessary to determine the specific resonant frequency of the target using the information contained in the spectra of the area of interest. Then, a special RF pulse sequence is applied to produce a fast excitation and relaxation mechanism that generates a temperature increase in the tumour, causing cell death or a metabolic malfunction that stops cell division.
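The abstract mentions that the control software records the temperature response of the MNPs under the applied field; a common figure of merit derived from such curves (not described in the abstract itself, and sketched here under simple assumptions: a dilute, water-like fluid and an adiabatic initial-slope approximation, with hypothetical data) is the specific absorption rate (SAR):

```python
import numpy as np

def sar_initial_slope(time_s, temp_C, heat_capacity_J_per_L_K=4186.0,
                      mnp_concentration_g_per_L=1.0, fit_points=10):
    """Estimate the specific absorption rate (W per gram of MNPs) from the
    initial heating slope of a magnetic fluid under an AC field:
        SAR = c * (dT/dt at t = 0) / concentration
    Assumes a dilute, water-like fluid and negligible heat losses at t = 0."""
    slope, _intercept = np.polyfit(time_s[:fit_points], temp_C[:fit_points], 1)
    return heat_capacity_J_per_L_K * slope / mnp_concentration_g_per_L

# Hypothetical temperature log (seconds, degrees Celsius) from the control software
t = np.arange(0, 60, 5.0)
T = 25.0 + 0.08 * t - 0.0004 * t**2          # heating that gradually saturates
print(f"SAR = {sar_initial_slope(t, T):.0f} W/g")
```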
Abstract:
OntoTag - A Linguistic and Ontological Annotation Model Suitable for the Semantic Web
1. INTRODUCTION. LINGUISTIC TOOLS AND ANNOTATIONS: THEIR LIGHTS AND SHADOWS
Computational Linguistics is already a consolidated research area. It builds upon the results of two other major ones, namely Linguistics and Computer Science and Engineering, and it aims at developing computational models of human language (or natural language, as it is termed in this area). Possibly, its best-known applications are the different tools developed so far for processing human language, such as machine translation systems and speech recognizers or dictation programs.
These tools for processing human language are commonly referred to as linguistic tools. Apart from the examples mentioned above, there are also other types of linguistic tools that perhaps are not so well-known, but on which most of the other applications of Computational Linguistics are built. These other types of linguistic tools comprise POS taggers, natural language parsers and semantic taggers, amongst others. All of them can be termed linguistic annotation tools.
Linguistic annotation tools are important assets. In fact, POS and semantic taggers (and, to a lesser extent, also natural language parsers) have become critical resources for the computer applications that process natural language. Hence, any computer application that has to analyse a text automatically and ‘intelligently’ will include at least a module for POS tagging. The more an application needs to ‘understand’ the meaning of the text it processes, the more linguistic tools and/or modules it will incorporate and integrate.
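For illustration only (this snippet is not part of the work described here), such a POS-tagging module can be as simple as a call to the NLTK library; the resource names passed to nltk.download may vary slightly between NLTK versions.

```python
import nltk

# One-off model downloads (tokeniser and tagger)
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

sentence = "Linguistic annotation tools are important assets."
tokens = nltk.word_tokenize(sentence)
print(nltk.pos_tag(tokens))
# e.g. [('Linguistic', 'JJ'), ('annotation', 'NN'), ('tools', 'NNS'), ('are', 'VBP'), ...]
```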
However, linguistic annotation tools have still some limitations, which can be summarised as follows:
1. Normally, they perform annotations only at a certain linguistic level (that is, Morphology, Syntax, Semantics, etc.).
2. They usually introduce a certain rate of errors and ambiguities when tagging. This error rate ranges from 10 percent up to 50 percent of the units annotated for unrestricted, general texts.
3. Their annotations are most frequently formulated in terms of an annotation schema designed and implemented ad hoc.
A priori, it seems that the interoperation and the integration of several linguistic tools into an appropriate software architecture could most likely solve the limitations stated in (1). Besides, integrating several linguistic annotation tools and making them interoperate could also minimise the limitation stated in (2). Nevertheless, in the latter case, all these tools should produce annotations for a common level, which would have to be combined in order to correct their corresponding errors and inaccuracies. Yet, the limitation stated in (3) prevents both types of integration and interoperation from being easily achieved.
In addition, most high-level annotation tools rely on other, lower-level annotation tools and their outputs to generate their own. For example, sense-tagging tools (operating at the semantic level) often use POS taggers (operating at a lower level, i.e. the morphosyntactic one) to identify the grammatical category of the word or lexical unit they are annotating. Accordingly, if a faulty or inaccurate low-level annotation tool is to be used by another, higher-level one in its process, the errors and inaccuracies of the former should be minimised in advance; otherwise, these errors and inaccuracies will be transferred to (and even magnified in) the annotations of the high-level annotation tool.
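A minimal sketch (our illustration, not the mechanism of the model presented here) of how annotations produced by several tools for a common level might be combined to reduce their individual errors is a token-wise majority vote over aligned POS tag sequences:

```python
from collections import Counter

def combine_pos_annotations(annotations):
    """Combine token-level POS annotations from several taggers by majority vote.

    `annotations` is a list of tag sequences (one per tagger), all aligned to the
    same tokenisation. Ties and full disagreement fall back to the first tagger."""
    combined = []
    for position, tags in enumerate(zip(*annotations)):
        best_tag, best_count = Counter(tags).most_common(1)[0]
        combined.append(best_tag if best_count > 1 else annotations[0][position])
    return combined

taggers_output = [
    ["DT", "NN", "VBZ", "JJ"],
    ["DT", "NN", "VB",  "JJ"],
    ["DT", "JJ", "VBZ", "JJ"],
]
print(combine_pos_annotations(taggers_output))  # ['DT', 'NN', 'VBZ', 'JJ']
```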
Therefore, it would be quite useful to find a way to
(i) correct or, at least, reduce the errors and the inaccuracies of lower-level linguistic tools;
(ii) unify the annotation schemas of different linguistic annotation tools or, more generally speaking, make these tools (as well as their annotations) interoperate.
Clearly, solving (i) and (ii) should ease the automatic annotation of web pages by means of linguistic tools and their transformation into Semantic Web pages (Berners-Lee, Hendler and Lassila, 2001). Yet, as stated above, (ii) is a type of interoperability problem. There again, ontologies (Gruber, 1993; Borst, 1997) have been successfully applied thus far to solve several interoperability problems. Hence, ontologies should also help solve the aforementioned problems and limitations of linguistic annotation tools.
Thus, to summarise, the main aim of the present work was to combine these separate approaches, mechanisms and tools for annotation from Linguistics and Ontological Engineering (and the Semantic Web) into a sort of hybrid (linguistic and ontological) annotation model suitable for both areas. This hybrid (semantic) annotation model should (a) benefit from the advances, models, techniques, mechanisms and tools of these two areas; (b) minimise (and even solve, when possible) some of the problems found in each of them; and (c) be suitable for the Semantic Web. The concrete goals that helped attain this aim are presented in the following section.
2. GOALS OF THE PRESENT WORK
As mentioned above, the main goal of this work was to specify a hybrid (that is, linguistically-motivated and ontology-based) model of annotation suitable for the Semantic Web (i.e. it had to produce a semantic annotation of web page contents). This entailed that the tags included in the annotations of the model had to (1) represent linguistic concepts (or linguistic categories, as they are termed in ISO/DCR (2008)), in order for this model to be linguistically-motivated; (2) be ontological terms (i.e., use an ontological vocabulary), in order for the model to be ontology-based; and (3) be structured (linked) as a collection of ontology-based
Abstract:
Matsukawa and Habeck (2007) analyse the main instruments for risk mitigation in infrastructure financing with Multilateral Financial Institutions (MFIs). Their review coincided with the global financial crisis of 2007-08 and is highly relevant today, considering the sovereign debt crisis, the lack of available capital and the increases in bank regulation in Western economies. The current macroeconomic environment has seen a slowdown in the level of finance for infrastructure projects, as they pose a higher credit risk given their requirements for long-term investment. The rationale for this work is to look for innovative solutions focused on the credit risk mitigation of infrastructure and energy projects whilst optimizing the economic capital allocation for commercial banks. This objective is achieved through risk-sharing with MFIs and seeking capital relief in project finance transactions. This research answers the main question, "What is the impact of risk-sharing with MFIs on project finance transactions to increase their efficiency and viability?", and is developed from the perspective of a commercial bank assessing the economic capital used and analysing the relevant variables: Probability of Default, Loss Given Default and Recovery Rates (Altman, 2010). An overview of project finance for the infrastructure and energy sectors in terms of the volume of transactions worldwide is given, along with a summary of risk-sharing financing with MFIs. The current regulatory framework underlying risk-sharing in structured finance with MFIs is also reviewed. From here, the impact of risk-sharing and the diversification effect in infrastructure and energy projects is assessed from the perspective of economic capital allocation for a commercial bank. CreditMetrics (J. P. Morgan, 1997) is applied to an existing, well-diversified portfolio of project finance infrastructure and energy investments, working with the main risk capital measures: economic capital, RAROC and EVA. The conclusions of this research show that economic capital allocation on a portfolio of project finance, together with risk-sharing with MFIs, has a large impact on capital relief whilst increasing profitability for commercial banks. There is a substantial diversification effect due to the portfolio, which is combined with risk mitigation and an improvement in recovery rates through Partial Credit Guarantees issued by MFIs. A stress-test scenario analysis is applied to the current assumptions and credit risk model, considering a downgrade in the rating of the commercial bank (lender) and an increase of defaults in emerging countries, with a direct impact on economic capital through an increase in expected loss and a decrease in profitability. Obtaining capital relief through risk-sharing makes it more viable for commercial banks to finance infrastructure and energy projects, with the beneficial effect of a direct impact of these investments on GDP growth and employment. The main contribution of this work is to promote a strategic economic capital allocation in infrastructure and energy financing, through innovative risk-sharing with MFIs and economic pricing, to create economic value added for banks and to allow the financing of more infrastructure and energy projects. This work suggests several topics for further research in relation to the issues analysed.
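As a hedged numerical illustration of the risk measures named in the abstract (the figures and the simple single-exposure formulas below are hypothetical; the thesis itself works with a full CreditMetrics portfolio model):

```python
def expected_loss(pd, lgd, ead):
    """Expected loss of a single exposure: EL = PD * LGD * EAD (LGD = 1 - recovery rate)."""
    return pd * lgd * ead

def raroc(net_revenue, el, economic_capital):
    """Risk-adjusted return on capital: risk-adjusted net revenue over economic capital."""
    return (net_revenue - el) / economic_capital

def eva(net_revenue, el, economic_capital, hurdle_rate):
    """Economic value added: risk-adjusted profit minus the cost of economic capital."""
    return (net_revenue - el) - hurdle_rate * economic_capital

# Hypothetical project-finance loan: 100 M EUR exposure, 2% PD,
# 40% recovery rate (LGD = 60%), 8 M EUR economic capital before risk-sharing.
el = expected_loss(pd=0.02, lgd=0.60, ead=100.0)                                  # 1.2 M EUR
print(raroc(net_revenue=3.0, el=el, economic_capital=8.0))                        # ~0.225
print(eva(net_revenue=3.0, el=el, economic_capital=8.0, hurdle_rate=0.12))        # ~0.84 M EUR

# With a partial credit guarantee from an MFI covering part of the exposure,
# expected loss and economic capital fall, and RAROC/EVA improve accordingly.
```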
Matsukawa and Habeck (2007) analyse the main risk-mitigation instruments offered by Multilateral Financial Institutions (MFIs) for infrastructure financing. Their study appeared at the onset of the financial crisis in August 2007, whose consequences persist today, notably the sovereign debt of developed economies and the capitalisation problems of banks. This macroeconomic environment has slowed the financing of infrastructure projects. The present research is motivated by the search for solutions for financing infrastructure and energy projects while mitigating their inherent risks, with the aim of reducing the economic capital consumed by the lending banks. This objective is achieved by sharing the financing risk with MFIs through risk-sharing structures. The research answers the question: "What is the impact of risk-sharing with MFIs on project financing to increase its efficiency and viability?" The work is developed from the perspective of a commercial bank, estimating the economic capital consumed in project financing and analysing the main credit-risk variables: Probability of Default, Loss Given Default and Recovery Rates (Altman, 2010). The research presents global Project Finance figures for the infrastructure and energy sectors and analyses the international regulatory framework regarding economic capital consumption in project financing involving MFIs. The work then models a real, well-diversified portfolio of infrastructure and energy Project Finance, applying the CreditMetrics methodology (J. P. Morgan, 1997), in order to estimate the economic capital consumed and the profitability of the project portfolio through RAROC and EVA. The model makes it possible to estimate the diversification effect and the release of economic capital resulting from risk-sharing. The results show the large impact of the portfolio diversification effect and of the MFIs' partial guarantees, which mitigate risks, improve project recovery rates and reduce the economic capital consumed by the commercial bank, while increasing profitability (RAROC) and creating economic value (EVA). In economic scenarios of instability, deterioration of bank ratings, and increases in project defaults and portfolio correlations, there is a direct impact on economic capital and a loss of profitability. The release of economic capital, as proposed in this research, will allow more infrastructure and energy projects to be financed, which will translate into greater economic growth and job creation. The main contribution of this work is to promote active management of economic capital in the financing of infrastructure and energy projects, through innovative risk-sharing structures with MFIs and the creation of economic value in commercial banks, which would improve their efficiency and capitalisation. The methodological contribution of the work, by virtue of its originality, suggests and facilitates new lines of academic research into the main credit-risk variables affecting economic capital in project financing.