27 results for horizons d’attente
Abstract:
Flows of relevance to new-generation aerospace vehicles exist which are weakly dependent on the streamwise direction and strongly dependent on the other two spatial directions, such as the flow around the (flattened) nose of the vehicle and the associated elliptic-cone model. Exploiting these characteristics, a parabolic integration of the Navier-Stokes equations is more appropriate than solution of the full equations, resulting in the so-called Parabolic Navier-Stokes (PNS) equations. This approach is not only the best candidate, in terms of computational efficiency and accuracy, for the computation of steady base flows with the stated properties, but it also permits instability analysis and laminar-turbulent transition studies to be performed a posteriori to the base-flow computation. This is to be contrasted with the alternative approach of using orders-of-magnitude more expensive spatial Direct Numerical Simulations (DNS) for the description of the transition process. The PNS equations used here have been formulated for an arbitrary coordinate transformation, and the spatial discretization is performed using a novel, stable, high-order finite-difference-based numerical scheme, ensuring the recovery of highly accurate solutions using modest computing resources. For verification purposes, the boundary-layer solution around a circular cone at zero angle of attack is compared in the incompressible limit with theoretical profiles. Also, the recovered shock-wave angle at supersonic conditions is compared with theoretical predictions for the same circular-base cone geometry. Finally, the entire flow field, including the shock position and the compressible boundary layer around a 2:1 elliptic cone, is recovered at Mach numbers 3 and 4.
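The abstract mentions a high-order finite-difference discretization but does not give its formulation, so the following is only an illustrative sketch (not the authors' scheme) of a standard fourth-order central-difference approximation of a first derivative on a uniform grid, assuming NumPy:

    import numpy as np

    def d1_central_4th(f, dx):
        # Fourth-order central difference for the first derivative on a uniform grid.
        # Only interior points i = 2 .. n-3 are returned; boundary closures are omitted.
        return (f[:-4] - 8.0 * f[1:-3] + 8.0 * f[3:-1] - f[4:]) / (12.0 * dx)

    # Quick check against an analytic derivative (illustrative only).
    x = np.linspace(0.0, 2.0 * np.pi, 201)
    err = np.max(np.abs(d1_central_4th(np.sin(x), x[1] - x[0]) - np.cos(x[2:-2])))
    print(f"max error of the 4th-order stencil: {err:.2e}")

A production PNS solver would combine such stencils with the arbitrary coordinate transformation and a stable boundary treatment, none of which is reproduced here.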
Abstract:
En este proyecto se hace un análisis en profundidad de las técnicas de ataque a las redes de ordenadores conocidas como APTs (Advanced Persistent Threats), viendo cuál es el impacto que pueden llegar a tener en los equipos de una empresa y el posible robo de información y pérdida monetaria que puede llevar asociada. Para hacer esta introspección veremos qué técnicas utilizan los atacantes para introducir el malware en la red y también cómo dicho malware escala privilegios, obtiene información privilegiada y se mantiene oculto. Además, y como parte experimental de este proyecto, se ha desarrollado una plataforma para la detección de malware en una red en base a las webs, URLs e IPs que visitan los nodos que la componen. Obtendremos esta visión gracias a la extracción de los logs y registros de consultas DNS de la compañía, sobre los que realizaremos un análisis exhaustivo. Para poder inferir correctamente qué equipos están infectados o no, se ha utilizado un algoritmo de desarrollo propio inspirado en la técnica Belief Propagation (“propagación basada en creencia”), que ya ha sido usada antes por desarrolladores como los de Los Álamos, en Nuevo México (Estados Unidos), para fines similares a los que aquí se muestran. Además, para mejorar la velocidad de inferencia y el rendimiento del sistema se propone un algoritmo adaptado a la plataforma Hadoop de Apache, por lo que se modifica el paradigma de programación habitual y se adopta un nuevo paradigma conocido como MapReduce, que consiste en la división de la información en pares clave-valor. Por una parte, los algoritmos existentes basados en Belief Propagation para el descubrimiento de malware son propietarios y no han sido publicados completamente hasta la fecha; por otra parte, estos algoritmos aún no han sido adaptados a Hadoop ni a ningún otro modelo de programación distribuida, aspecto que se abordará en este proyecto. No es propósito de este proyecto desarrollar una plataforma comercial o funcionalmente completa, sino estudiar el problema de las APTs y realizar una implementación que demuestre que la plataforma mencionada es factible. Este proyecto abre, a su vez, un horizonte nuevo de investigación en el campo de la adaptación al modelo MapReduce de algoritmos del tipo Belief Propagation para la detección de malware mediante registros DNS. ABSTRACT. This project makes an in-depth investigation of the problems related to APTs (Advanced Persistent Threats) in computer networks nowadays, examining how much damage they can inflict on the hosts of a company and how much monetary and information loss they may cause. In this investigation we will see which techniques are generally applied by attackers to inject malware into networks and how this malware escalates its privileges, extracts privileged information and stays hidden. As the main experimental part of this project, this document shows how to develop and configure a platform that can detect malware from the URLs and IPs visited by the hosts of the network. This information can be extracted from the logs and DNS query records of the company, on which we will perform an in-depth analysis. A self-developed algorithm inspired by the Belief Propagation technique has been used to infer which hosts are infected and which are not. This technique has been used before by developers at Los Alamos (New Mexico, USA) for similar purposes. Moreover, this project proposes an algorithm adapted to the Apache Hadoop platform in order to improve the inference speed and system performance.
This platform replaces the traditional programming paradigm with a new paradigm called MapReduce, which splits and distributes information among hosts and uses key-value pairs. On the one hand, the existing algorithms based on Belief Propagation are proprietary software and have not been published in full, as they have been patented due to the substantial economic benefits they can provide. On the other hand, these algorithms have been adapted neither to Hadoop nor to any other distributed programming paradigm. This situation turns the challenge into a complicated problem and could lead to a dramatic increase in the difficulty of installing such a system at a client corporation. It is not the purpose of this project to develop a complete, 100% functional commercial platform. Herein, a short summary of the APT problem will be presented and an effort will be made to demonstrate the viability of an APT-detection platform. At the same time, this project opens up new horizons of research on adapting Belief Propagation algorithms to the MapReduce model and on malware detection from DNS records.
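The thesis's own inference algorithm and its Hadoop adaptation are not published, so the following is only a minimal, hypothetical sketch of the general idea: one message-passing round, in the spirit of Belief Propagation, expressed as a map step and a reduce step over (host, domain) edges extracted from DNS logs. The host names, domain names and badness scores are invented for illustration, and a real Hadoop job would run these functions on a cluster rather than in plain Python.

    from collections import defaultdict

    # Illustrative (host, domain) edges extracted from DNS query logs, plus a prior
    # "badness" score per domain (e.g. seeded from a blacklist). All values are made up.
    edges = [("h1", "evil.example"), ("h1", "news.example"),
             ("h2", "news.example"), ("h3", "evil.example"), ("h3", "ads.example")]
    domain_badness = {"evil.example": 0.9, "news.example": 0.05, "ads.example": 0.3}

    def map_phase(edges, badness):
        # Map step: emit one (host, domain_badness) key-value pair per DNS edge.
        for host, domain in edges:
            yield host, badness.get(domain, 0.1)   # 0.1 = prior for unknown domains

    def noisy_or(values):
        # P(infected) = 1 - prod(1 - badness_i) over the domains the host visited.
        p = 1.0
        for v in values:
            p *= (1.0 - v)
        return 1.0 - p

    def reduce_phase(pairs):
        # Reduce step: group the emitted pairs by host and combine the evidence.
        grouped = defaultdict(list)
        for host, b in pairs:
            grouped[host].append(b)
        return {host: noisy_or(bs) for host, bs in grouped.items()}

    host_suspicion = reduce_phase(map_phase(edges, domain_badness))
    print(host_suspicion)   # hosts visiting suspicious domains score higher

A full belief-propagation scheme would iterate such rounds in both directions (hosts to domains and back) until the scores converge; the sketch stops after a single host-side update.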
Abstract:
Dada la importancia de conocer la humedad del suelo de forma precisa y en tiempo real, se ha realizado este trabajo de investigación cuyo objetivo principal ha sido seleccionar un Balance Hídrico del Suelo (BHS) diario y validar sus estimaciones de humedad del suelo frente a medidas obtenidas “in situ”, aplicándolo a tres emplazamientos seleccionados en la zona centro con características edáficas y climáticas diferentes, y de este modo estimar con cierta precisión la humedad del suelo como Agua Disponible (AD) para las plantas y a su vez permitir la realización de estudios climáticos. Los observatorios meteorológicos seleccionados fueron: Guadalajara/El Serranillo en la zona aluvial del río Henares; Colmenar Viejo/Base Famet en la rampa sur del Guadarrama sobre rocas metamórficas; y Radiosondeo/Madrid(Barajas) en arenas arcósicas de grano grueso. Se realizó una caracterización morfológica y un estudio de las propiedades físicas, químicas e hidrofísicas de los suelos en cada emplazamiento. El suelo de Guadalajara, Xerorthent Típico presenta una secuencia genética de horizontes (Ap-AC-C1-C2) siendo su clase textural entre franco-arenosa a franca, con menos del 2% de elementos gruesos, presencia de caliza a lo largo de todo el perfil, destacando la homogeneidad en vertical y horizontal de sus propiedades. El suelo de Colmenar, Xerorthent Dystrico, presenta una secuencia genética de horizontes (A-C-C/R) apareciendo el horizonte C/R entre 20-30 cm; y la roca aproximadamente a unos 30 cm; destacando en este perfil su acidez y el alto contenido de elementos gruesos. El suelo de Radiosondeo, Haploxeralf Típico, presenta la secuencia normal de horizontes de los alfisoles (A-Bt1-Bt2-C/Bt); destacando su heterogeneidad principalmente en el plano horizontal, con presencia del Bt a diferentes profundidades en un corto espacio longitudinal. En una primera fase de experimentación (2007-2008) se seleccionaron BHS diarios que sólo utilizaban como datos de entrada la información de variables meteorológicas y el valor del Agua Disponible Total (ADT) para cada tipo de suelo y profundidad. Se probaron BHS diarios con agotamiento exponencial y directo de la reserva, utilizando la evapotranspiración de referencia de Penman-Monteith recomendada por FAO. Al mismo tiempo que se disponía de los datos estimados de humedad de suelo mediante diferentes BHS diarios en los tres emplazamientos, también se realizó una monitorización de la humedad del suelo “in situ” mediante el método gravimétrico, con adaptación de dicha metodología a la problemática de cada suelo, para determinar en cada fecha tanto la humedad del suelo como su contenido de AD para una profundidad de 0 a 30 cm. Se tomaron en cada fecha de muestreo 5 muestras para la profundidad 0- 10 cm, otras cinco para 10-20 cm y otras cinco para 20-30 cm, realizándose el correspondiente tratamiento estadístico de los datos. El ADT se calculó a partir de los datos de capacidad de campo y punto de marchitez obtenidos en laboratorio con membrana de Richards. Los resultados de esta primera fase permitieron conocer que el BHS exponencial diario era el que mejor estimaba el AD en Guadalajara considerando la capacidad de campo a una presión de 33 kPa, mientras que en Colmenar se debían considerar para un mejor ajuste, 10 kPa en lugar de 33 kPa. En el observatorio de Radiosondeo debido a que en cada fecha de muestreo la profundidad en la que aparecía el horizonte Bt era diferente, no se pudo demostrar si el BHS exponencial diario tenía un buen comportamiento. 
En una segunda fase de experimentación (2009-2012) y con el objeto de aminorar los problemas encontrados en Radiosondeo para la medida de humedad del suelo por el método gravimétrico, se procedió a la instalación y utilización de diferentes sensores de medida de humedad de suelo en el mismo observatorio: TDR (time domain reflectometry - TRIME T3 de IMKO); FDR capacitivo (frequency domain reflectometry - ECH2O EC-20 de DECAGON) y otros. Esta segunda fase de experimentación tuvo una duración de 4 años y se compararon las medidas de humedad de suelo obtenidas a partir de los sensores con las estimadas del BHS exponencial hasta una profundidad de 0 a 85 cm. En laboratorio se realizaron calibraciones específicas de los sensores TDR y FDR para cada uno de los horizontes más diferenciados del Haploxeralf Típico, utilizando diferentes tipos de regresión. Los valores de humedad de suelo con el equipo TDR, corregidos mediante la calibración específica de laboratorio, fueron los que más se ajustaron a las medidas realizadas por el método gravimétrico “in situ”, por lo que se utilizó el TDR para las comparaciones con los valores obtenidos del BHS exponencial diario durante los cuatro años de esta segunda fase experimental. Se realizaron diferentes estimaciones del ADT, partiendo de datos de laboratorio y/o de datos de humedad procedentes de los sensores en campo. Los resultados mostraron de nuevo la conveniencia de utilizar el BHS exponencial diario, pero en este caso con la estimación del ADT realizada a partir de las gráficas de los sensores. Mediante la utilización de los datos de humedad del BHS exponencial diario se han realizado comparaciones con el mismo tipo de balance pero utilizando un periodo semanal o mensual en lugar de diario, para conocer las diferencias. Los valores obtenidos con periodicidad mensual han dado valores de AD inferiores a los de los balances calculados semanalmente o diariamente. Por último, se ha comprobado que los resultados de un BHS exponencial diario pueden complementar la información que se obtiene del Índice de Precipitación Estandarizado (SPI) y pueden mejorar el estudio de la sequía agrícola. ABSTRACT. Due to the importance of knowing soil water content precisely and in real time, this research work has been carried out with the main objective of selecting a daily Soil Water Balance (SWB) to estimate soil water content and validating its estimates against “in situ” measurements. Three locations, differing in soil and climate characteristics, were selected in central Spain in order to estimate soil water as plant-Available Water (AW) with reasonable accuracy and to serve as a tool for climatic studies. The selected meteorological observatories were: Guadalajara/El Serranillo, in the alluvial area of the Henares river; Colmenar Viejo/Base Famet, on the southern ramp of the Guadarrama over metamorphic rocks; and Radiosondeo/Madrid (Barajas), on coarse-grained arkosic sands. A morphological characterization and a study of the physical, chemical and hydrophysical properties of the soils were carried out at each site. In Guadalajara the soil is a Typic Xerorthent with an (Ap-AC-C1-C2) genetic horizon sequence, a sandy-loam to loam textural class, less than 2% rock fragments and presence of CaCO3 throughout the whole profile; the vertical and horizontal homogeneity of its properties is noteworthy.
In Colmenar the soil is a Dystric Xerorthent with an (A-C-C/R) genetic horizon sequence; the C/R horizon appears at 20-30 cm and the rock at approximately 30 cm; characteristic features of this profile are its high acidity and its high rock-fragment content. In Radiosondeo the soil is a Typic Haploxeralf with the usual alfisol genetic horizon sequence (A-Bt1-Bt2-C/Bt); its most notable feature is its horizontal heterogeneity, with the depth of the Bt (clay-enriched) horizon varying over short distances. In a first experimental stage (2007-2008), daily SWBs were selected that use as input data only meteorological variables and the plant Total Available Water (TAW) for each soil type and depth. Different daily SWBs (with exponential or direct depletion of the plant-Available Water) were applied, using the Penman-Monteith reference evapotranspiration (ETo) recommended by FAO. While soil water content was being estimated with the different daily SWBs at the three locations, it was also monitored “in situ” by the gravimetric method, adapted to the characteristics of each soil, to determine on each date both the soil water content and the AW for a depth of 0 to 30 cm. On each sampling date, five samples were taken for each depth (0-10 cm, 10-20 cm and 20-30 cm) and the data were submitted to the corresponding statistical analysis. The TAW was calculated from field capacity (FC) and permanent wilting point (PWP) data obtained in the laboratory with the Richards pressure plate. Results from this first experimental stage show that the daily exponential SWB was the one that best estimated the AW in Guadalajara, considering field capacity at -33 kPa, whereas in Colmenar field capacity at -10 kPa had to be considered instead of -33 kPa for a better fit. In Radiosondeo, because the depth of the Bt horizon varied between sampling dates, it could not be established whether the daily exponential SWB performed well. In a second experimental stage (2009-2012), and with the objective of minimizing the problems encountered in Radiosondeo when measuring soil water content “in situ” by the gravimetric method, different sensors for measuring soil water content were installed and used at the same site: TDR (time domain reflectometry - TRIME T3 from IMKO), capacitance FDR (frequency domain reflectometry - ECH2O EC-20 from DECAGON) and others. This second experimental stage lasted four years, during which the soil water measurements from the sensors were compared with the estimates of the exponential SWB from 0 to 85 cm depth. In the laboratory, specific calibrations of the TDR and FDR sensors were carried out for the most differentiated horizons of the Typic Haploxeralf, using different types of regression. The results showed that the soil water data obtained with the TDR equipment, corrected by the specific laboratory calibration, best fitted the “in situ” gravimetric soil water measurements, so the TDR was used for the comparisons with the daily exponential SWB during the four years of this second experimental stage. Various estimations of the TAW were tested, based on laboratory data and/or on data from the soil water content field sensors. The results again confirmed the convenience of using the daily exponential SWB, though in this case with the TAW estimated from the field-sensor curves. Soil water estimated by the exponential SWB on a daily basis was also compared with the same balance computed over weekly and monthly periods, in order to assess the differences.
The results obtained for a monthly period gave lower AW values than those obtained for weekly or daily periods. Finally, it has been shown that the results of a daily exponential SWB can provide information complementary to the Standardized Precipitation Index (SPI) and can improve the study of agricultural drought.
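The exact daily balance validated in the thesis is not reproduced in the abstract, so the sketch below only illustrates one common exponential-depletion formulation of a daily soil water balance (Thornthwaite-Mather type), driven by daily precipitation and reference evapotranspiration (ETo) and bounded by the Total Available Water (TAW); the TAW value, the units (mm) and the example numbers are assumptions for the example.

    import math

    def daily_exponential_swb(precip, eto, taw, w0=None):
        # Exponential depletion: on dry days the reserve decays as W = TAW*exp(-APWL/TAW),
        # where APWL is the accumulated potential water loss; on wet days the surplus
        # recharges the reserve up to TAW and APWL is reset accordingly.
        w = taw if w0 is None else w0
        apwl = 0.0 if w >= taw else -taw * math.log(w / taw)
        series = []
        for p, e in zip(precip, eto):
            if p >= e:                       # wet day: recharge up to TAW
                w = min(taw, w + (p - e))
                apwl = 0.0 if w >= taw else -taw * math.log(w / taw)
            else:                            # dry day: accumulate potential loss
                apwl += (e - p)
                w = taw * math.exp(-apwl / taw)
            series.append(w)                 # available water, same units as TAW (mm)
        return series

    # Example: TAW = 60 mm, a dry week followed by a 20 mm rain event, ETo = 5 mm/day.
    print(daily_exponential_swb(precip=[0] * 7 + [20], eto=[5] * 8, taw=60))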
Abstract:
The main objective of this paper is the development and application of multivariate time series models for forecasting aggregated wind power production in a country or region. Nowadays, in Spain, Denmark or Germany there is an increasing penetration of this kind of renewable energy, partly to reduce energy dependence on the exterior, but always linked to the increase and uncertainty affecting the prices of fossil fuels. The availability of accurate predictions of wind power generation is crucial both for the System Operator and for all the agents of the Market. However, the vast majority of works rarely consider forecasting horizons longer than 48 hours, although these are of interest for system planning and operation. In this paper we use Dynamic Factor Analysis, adapting and modifying it conveniently, to reach our aim: the computation of accurate forecasts of the aggregated wind power production in a country for a forecasting horizon as long as possible, in particular up to 60 days (2 months). We illustrate this methodology and the results obtained with real data from the leading country in wind power production: Denmark.
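The paper's own Dynamic Factor Analysis formulation is not reproduced here; the sketch below is only a much-simplified illustration of the underlying idea, assuming a panel of regional wind-power series: common factors are extracted by a principal-component decomposition, each factor is propagated with a crude AR(1) fitted by least squares, and the factor forecasts are mapped back to the observed series. The data, the number of factors and the 60-day horizon are placeholders.

    import numpy as np

    def dfa_forecast(panel, n_factors=2, horizon=60):
        # panel: T x N matrix (rows = time, columns = regional wind power series).
        x = panel - panel.mean(axis=0)
        u, s, vt = np.linalg.svd(x, full_matrices=False)
        factors = u[:, :n_factors] * s[:n_factors]      # T x k common factors
        loadings = vt[:n_factors]                       # k x N loadings
        paths = []
        for f in factors.T:
            phi = np.dot(f[:-1], f[1:]) / np.dot(f[:-1], f[:-1])   # AR(1) coefficient
            last, path = f[-1], []
            for _ in range(horizon):
                last = phi * last
                path.append(last)
            paths.append(path)
        f_hat = np.array(paths).T                       # horizon x k factor forecasts
        return f_hat @ loadings + panel.mean(axis=0)    # horizon x N series forecasts

    # Toy usage with synthetic data standing in for daily regional wind production.
    rng = np.random.default_rng(0)
    toy = rng.normal(size=(365, 5)).cumsum(axis=0)
    print(dfa_forecast(toy, n_factors=2, horizon=60).shape)   # (60, 5)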
Abstract:
El enriquecimiento del conocimiento sobre la Irradiancia Solar (IS) a nivel de superficie terrestre, así como su predicción, cobran gran interés para las Energías Renovables (ER) -Energía Solar (ES)- y para distintas aplicaciones industriales o ecológicas. En el ámbito de las ER, el uso óptimo de la ES implica contar con datos de la IS en superficie que ayuden tanto en la selección de emplazamientos para instalaciones de ES, como en su etapa de diseño (dimensionar la producción) y, finalmente, en su explotación. En este último caso, la observación y la predicción son útiles para el mercado energético y para la planificación y gestión de la energía (generadoras y operadoras del sistema eléctrico), especialmente en los nuevos contextos de las redes inteligentes de transporte. A pesar de la importancia estratégica de contar con datos de la IS, especialmente los observados por sensores de IS en superficie (los que mejor captan esta variable), estos no siempre están disponibles para los lugares de interés ni con la resolución espacial y temporal deseada. Esta limitación se une a la necesidad de disponer de predicciones a corto plazo de la IS que ayuden a la planificación y gestión de la energía. Se han indagado y caracterizado las Redes de Estaciones Meteorológicas (REM) existentes en España que publican en internet sus observaciones, focalizando en la IS. Se han identificado 24 REM (16 gubernamentales y 8 redes de voluntarios) que aglutinan 3492 estaciones, convirtiéndose éstas en las fuentes de datos meteorológicos utilizados en la tesis. Se han investigado cinco técnicas de estimación espacial de la IS en intervalos de 15 minutos para el territorio peninsular (3 técnicas geoestadísticas, una determinística y el método HelioSat2 basado en imágenes satelitales) con distintas configuraciones espaciales. Cuando el área de estudio tiene una adecuada densidad de observaciones, el mejor método identificado para estimar la IS es el Kriging con Regresión usando variables auxiliares -una de ellas la IS estimada a partir de imágenes satelitales-. De este modo es posible estimar espacialmente la IS más allá de los 25 km identificados en la bibliografía. En caso contrario, se corrobora la idoneidad de utilizar estimaciones a partir de sensores remotos cuando la densidad de observaciones no es adecuada. Se ha experimentado con el modelado de Redes Neuronales Artificiales (RNA) para la predicción a corto plazo de la IS utilizando observaciones próximas (componentes espaciales) en sus entradas, y los resultados son prometedores. Así, los niveles de error disminuyen bajo las siguientes condiciones: (1) cuando el horizonte temporal de predicción es inferior o igual a 3 horas, las estaciones vecinas que se incluyen en el modelo deben encontrarse a una distancia máxima aproximada de 55 km. Esto permite concluir que las RNA son capaces de aprender cómo afectan las condiciones meteorológicas vecinas a la predicción de la IS. ABSTRACT. The enrichment of knowledge about Solar Irradiance (SI) at the Earth's surface, as well as its prediction, is of great interest for Renewable Energy (RE) -Solar Energy (SE)- and for various industrial and environmental applications. In the field of RE, the optimal use of SE involves having surface SI data that help in the selection of sites for SE facilities, in their design stage (sizing energy production) and, finally, in their operation.
In the latter case, observation and prediction are useful for the energy market and for the planning and management of energy (generators and electrical system operators), especially in the new contexts of smart transport networks (smart grids). Despite the strategic importance of SI data, especially those observed by surface SI sensors (the ones that best measure this variable), these are not always available for the sites of interest or at the desired spatial and temporal resolution. This limitation is coupled with the need for short-term predictions of SI to help with planning and energy management. The existing Networks of Weather Stations (NWS) in Spain that publish their observations online have been surveyed and characterized, focusing on SI. 24 NWS have been identified (16 governmental and 8 volunteer networks), comprising 3492 stations, which become the sources of meteorological data used in the thesis. Five techniques for the spatial estimation of SI at 15-minute intervals over the mainland have been investigated (3 geostatistical techniques, one deterministic technique and the HelioSat2 method based on satellite images) with different spatial configurations. When the study area has an adequate density of observations, the best method identified to estimate SI is regression kriging with auxiliary variables (one of them the SI estimated from satellite images). Thus it is possible to spatially estimate SI beyond the 25 km identified in the literature. Otherwise, when the density of observations is inadequate, the appropriateness of using estimates from remote sensing is corroborated. Artificial Neural Network (ANN) modelling has been explored for the short-term prediction of SI using observations from neighbouring weather stations (spatial components) as inputs, and the results are promising. The error levels decrease under the following conditions: (1) when the prediction horizon is less than or equal to 3 hours, the best models are those that include data from neighbouring stations located at a maximum distance of about 55 km. It is concluded that the ANN is able to learn how neighbouring weather conditions affect the prediction of SI at such spatio-temporal horizons.
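The thesis's ANN architecture and exact inputs are not given in the abstract beyond the use of observations from stations within roughly 55 km, so the following sketch only illustrates the general setup with invented data and scikit-learn's MLPRegressor as a stand-in model: lagged irradiance at the target station plus lagged irradiance at the sufficiently close neighbouring stations is used to predict the next 15-minute value.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def build_features(si, distances_km, max_km=55.0, lag=1):
        # si: T x S matrix of 15-min irradiance observations, column 0 = target station.
        # distances_km[j]: distance from station j to the target station.
        neighbours = [j for j, d in enumerate(distances_km) if j != 0 and d <= max_km]
        X = np.column_stack([si[:-lag, 0]] + [si[:-lag, j] for j in neighbours])
        y = si[lag:, 0]                       # target irradiance 'lag' steps ahead
        return X, y

    # Synthetic stand-in for real station records; the 80 km neighbour is excluded.
    rng = np.random.default_rng(1)
    si = np.clip(rng.normal(500.0, 150.0, size=(2000, 4)), 0.0, None)
    X, y = build_features(si, distances_km=[0.0, 20.0, 48.0, 80.0])
    model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=500, random_state=0)
    model.fit(X[:-200], y[:-200])
    print(model.score(X[-200:], y[-200:]))    # R^2 on the held-out tail (meaningless on random data)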
Abstract:
The European Union has been promoting linguistic diversity for many years as one of its main educational goals. This is an element that facilitates student mobility and student exchanges between different universities and countries and enriches the education of young undergraduates. In particular, a higher degree of competence in the English language is becoming essential for engineers, architects and researchers in general, as English has become the lingua franca that opens up horizons to internationalisation and the transfer of knowledge in today's world. Many experts point to the Integrated Approach to Contents and Foreign Languages System as an option that offers certain benefits over the traditional method of teaching a second language exclusively through specific language subjects. This system advocates teaching the different subjects in the syllabus in a language other than one's mother tongue, without prioritising knowledge of the language over the subject. This was the idea that in the 2009/10 academic year gave rise to the Second Language Integration Programme (SLI Programme) at the Escuela Arquitectura Tecnica of the Universidad Politecnica de Madrid (EUATM-UPM), just as tuition began for the new Building Engineering Degree, which had been adapted to the European Higher Education Area (EHEA) model. This programme is an interdisciplinary initiative for the set of subjects taught during the semester and is coordinated through the Assistant Director's Office for Educational Innovation. The SLI Programme has a dual goal: to familiarise students with the specific English terminology of the subject being taught and, at the same time, to improve their communication skills in English. A total of thirty lecturers are taking part in the teaching of eleven first-year subjects and twelve second-year subjects, with around 120 students who have voluntarily enrolled in a special group in each semester. During the 2010/2011 academic year the degree of acceptance and the results of the SLI Programme are being monitored. Tools have been designed to aid interdisciplinary coordination and to analyse satisfaction, such as coordination records and surveys. The results currently available refer to the first semester of the year and are divided into specific aspects of the different subjects involved and into general aspects of the ongoing experience.
Abstract:
The W3C Best Practices for Multilingual Linked Open Data community group was born one year ago during the last MLW workshop in Rome. Nowadays, it continues leading the effort of a large community towards acquiring a shared view of the issues caused by multilingualism on the Web of Data and their possible solutions. Despite our initial optimism, we found the task of identifying best practices for ML-LOD a difficult one, requiring a deep understanding of the Web of Data in its multilingual dimension and in its practical problems. In this talk we will review the progress of the group so far, mainly in the identification and analysis of topics, use cases and design patterns, as well as the future challenges.
Abstract:
Article. New Forests, November 2015, Volume 46, Issue 5, pp 869-883. First online: 17 June 2015. Establishing Quercus ilex under Mediterranean dry conditions: sowing recalcitrant acorns versus planting seedlings at different depths and tube shelter light transmissions. Juan A. Oliet (Departamento de Sistemas y Recursos Naturales, E.T.S. Ingenieros de Montes, Universidad Politécnica de Madrid), Alberto Vázquez de Castro (Departamento de Sistemas y Recursos Naturales, E.T.S. Ingenieros de Montes, Universidad Politécnica de Madrid), Jaime Puértolas (Lancaster Environment Centre, Lancaster University). Abstract: Success of Mediterranean dry area restoration with oaks is a challenging goal. Testing eco-techniques that mimic the beneficial effects of natural structures and ameliorate stress contributes to positive solutions for overcoming establishment barriers. We ran a factorial experiment in a dry area, testing two levels of solid-wall light transmission of tube shelters (60 and 80 %) plus a control mesh, and two depths (shallow and 15 cm) for placing either planted seedlings or acorns of Quercus ilex. The microclimate of the planting or sowing spots was characterized by measuring photosynthetically active radiation (PAR), temperature and relative humidity. Plant response was evaluated in terms of survival, phenology, acorn emergence and photochemical efficiency (measured through chlorophyll fluorescence). We hypothesize that tube shelters and deep planting improve Q. ilex post-planting and sowing performance because of the combined effects of reducing excessive radiation and improving access to moist soil horizons. Results show that temperature and PAR were reduced, and relative humidity increased, in deep spots. Midsummer photochemical efficiency indicates the highest level of stress for oaks in the 80 % light transmission shelter. Optimum acorn emergence in spring was registered within solid-wall tree shelters, and maximum summer survival of germinants and of planted seedlings occurred when acorns or seedlings were placed at 15 cm depth, irrespective of the light transmission of the shelter. Survival of germinants was similar to that of planted seedlings. The study emphasizes the importance of techniques to keep high levels of viability after sowing recalcitrant seeds in the field.
Abstract:
En este documento está descrito detalladamente el trabajo realizado para completar todos los objetivos marcados para este Trabajo de Fin de Grado, que tiene como meta final el desarrollo de un dashboard configurable de gestión y administración para instancias de OpenStack. OpenStack es una plataforma libre y de código abierto utilizada como solución de Infraestructura como Servicio (Infrastructure as a Service, IaaS) en clouds tanto públicos, que ofrecen sus servicios cobrando el tiempo de uso o los recursos utilizados, como privados para su utilización exclusiva en el entorno de una empresa. El proyecto OpenStack se inició como una colaboración entre la NASA y RackSpace, y a día de hoy es mantenido por las empresas más potentes del sector tecnológico a través de la Fundación OpenStack. La plataforma OpenStack permite el acceso a sus servicios a través de una Interfaz de Línea de Comandos (Command Line Interface, CLI), una API RESTful y una interfaz web en forma de dashboard. Esta última es ofrecida a través del servicio Horizon. Este servicio provee de una interfaz gráfica para acceder, gestionar y automatizar servicios basados en cloud. El dashboard de Horizon presenta algunos problemas: solo admite opciones de configuración mediante código Python, lo que hace que el usuario no tenga ninguna capacidad de configuración y que el administrador esté obligado a interactuar directamente con el código, y no tiene soporte para múltiples regiones que permitan que un usuario pueda distribuir sus recursos por distintos centros de datos en diversas localizaciones como más le convenga. El presente Trabajo de Fin de Grado, que es la fase inicial del proyecto FI-Dash, pretende solucionar estos problemas mediante el desarrollo de un catálogo de widgets de la plataforma WireCloud que permitirán al usuario tener todas las funcionalidades ofrecidas por Horizon a la vez que le ofrecen capacidades de configuración y añaden funcionalidades no presentes en Horizon, como el soporte de múltiples regiones. Como paso previo al desarrollo del catálogo de widgets se ha llevado a cabo un estudio de las tecnologías y servicios ofrecidos por OpenStack, así como de las herramientas que pudieran ser necesarias para la realización del trabajo. El proceso de desarrollo ha sido dividido en distintas fases de acuerdo con los distintos componentes que forman parte del dashboard, cada uno con una función de gestión sobre un tipo de recurso distinto. Las otras fases del desarrollo han sido la integración completa del dashboard en la plataforma WireCloud y el diseño de una interfaz gráfica usable y atractiva. ---ABSTRACT--- Throughout this document the work performed in order to achieve all of the objectives set for this Final Project is described in detail; its main goal is the development of a configurable dashboard for managing and administrating OpenStack instances. OpenStack is a free and open-source platform used as an Infrastructure as a Service (IaaS) solution in both public clouds, which offer their services by charging for usage time or resources consumed, and private clouds intended for exclusive use within a company's environment. The OpenStack project started as a collaboration between NASA and Rackspace, and nowadays it is maintained by the most powerful companies in the technology sector through the OpenStack Foundation. The OpenStack platform provides access to its services through a Command Line Interface (CLI), a RESTful API and a web interface in the form of a dashboard. The latter is offered through a service called Horizon.
This service provides a graphical interface to access, manage and automate cloud-based services. Horizon's dashboard presents some problems: it only supports configuration through Python code, which gives the user no configuration capabilities and forces the administrator to interact directly with the code, and it has no support for multiple regions that would allow a user to distribute his resources across different data centers in different locations at his convenience. This Final Project, which is the initial stage of the FI-Dash project, aims to solve these problems by developing a catalog of widgets for the WireCloud platform that will give the user all the features offered by Horizon while providing configuration capabilities and adding features not present in Horizon, such as support for multiple regions. As a prelude to the development of the widget catalog, a study of the technologies and services offered by OpenStack, as well as of the tools that might be necessary to carry out the work, has been conducted. The development process has been split into phases matching the different components that are part of the dashboard, each of them managing a different kind of resource. The other development phases have been the full integration of the dashboard into WireCloud and the design of a graphical interface that is both usable and attractive.
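WireCloud widgets themselves are written in JavaScript, so the snippet below is not the project's widget code; it is only a hedged Python illustration of the kind of OpenStack REST calls such a dashboard ultimately consumes: obtaining a token from the Keystone v3 identity API and listing the project's servers from the Nova compute API. The endpoint URLs, ports and credentials are placeholders for a typical deployment.

    import requests

    KEYSTONE = "http://controller:5000/v3"       # assumed identity endpoint
    NOVA = "http://controller:8774/v2.1"         # assumed compute endpoint

    def get_token(user, password, project, domain="Default"):
        # Keystone v3 password authentication; the token is returned in X-Subject-Token.
        body = {"auth": {
            "identity": {"methods": ["password"],
                         "password": {"user": {"name": user,
                                               "domain": {"name": domain},
                                               "password": password}}},
            "scope": {"project": {"name": project, "domain": {"name": domain}}}}}
        r = requests.post(f"{KEYSTONE}/auth/tokens", json=body, timeout=10)
        r.raise_for_status()
        return r.headers["X-Subject-Token"]

    def list_servers(token):
        # Nova: list the servers visible to the scoped project.
        r = requests.get(f"{NOVA}/servers", headers={"X-Auth-Token": token}, timeout=10)
        r.raise_for_status()
        return [s["name"] for s in r.json()["servers"]]

    # Usage against a live deployment (credentials are placeholders):
    # token = get_token("demo", "secret", "demo")
    # print(list_servers(token))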
Abstract:
Languages for Specific Purposes (LSP) represent a dynamic approach both in research and in practice and, as such, are in constant evolution. The field was earlier related to the use of English as an international language of communication in business and technology and was thus designated ESP (English for Specific Purposes). In Genre Analysis, Swales (1990) opened up new horizons with the notions of genre and discourse community. Since then, research on LSP learning and discourse has thrived over a large range of thematic contents and methods. Current Trends in LSP Research: Aims and Methods can be placed within this latest strand.
Abstract:
Esta investigación es una incursión en el tránsito de Juan Navarro desde sus “habitaciones y horizontes”, las manifestaciones espontáneas de la mano y el proyectar, a la obra que nos devuelve a la experiencia física y corporal del mundo. Juan Navarro debe a sus manos gran parte de sus inquietudes y capacidades. Sus manos están presentes en su obra como materia –piezas de manos-, como herramienta -el dibujo por la mano- y como desencadenante en los procesos creativos de su obra de arquitectura. Las distintas obras remiten a una preocupación común: la de visualizar el espacio a través de un imaginario personal. Sin embargo, el proceso creativo en cada disciplina se desarrolla teniendo en cuenta la especificidad del medio y la experiencia que provoca en el espectador. La obra, como concreción del proceso creativo, se explica por las continuidades y discontinuidades entre las herramientas, mecanismos y estrategias utilizadas en los distintos medios. La tesis se estructura en dos partes, en la primera se estudia cómo se producen los procesos creativos, sus mecanismos en los distintos medios plásticos y el dibujo como herramienta transversal. Se identifican los conceptos y temas que dan lugar a la obra profundizando en el papel de la mano como presencia orgánica, biológica y responsable de una forma de representación personal. La segunda parte se articula en dos capítulos que, a través del dibujo, muestran la arquitectura como modelo e identifican los mecanismos utilizados en su forma de proyectar y su relación con la obra en distintos proyectos. El texto se estructura como una secuencia de ideas articuladas alrededor de un universo gráfico que nos conduce por múltiples itinerarios desde los que atisbar los procesos creativos de Juan Navarro. Estos caminos son hilos con los que se teje una visión personal de la relación entre las herramientas y mecanismos utilizados por Juan Navarro y su obra. La manera cómo se produce el proceso creativo, los mecanismos y las herramientas que los ponen en marcha constituyen una forma de abordar la obra, que hasta la fecha, se ha tratado aisladamente sin una intención de construir un cuerpo estructurado de conocimiento. En la arquitectura Juan Navarro existe un vacío de conocimiento teórico y gráfico sobre el propio proceso y su forma de proyectar. Se ha persistido en la explicación de la obra, sus referencias, temas abordados, relaciones y trasvases sin ahondar en la especificidad del medio. Estos vacíos establecen la necesidad y justificación de esta tesis doctoral. La investigación comienza descifrando una obra que desde sus inicios trabaja con la dualidad de lo gestual y lo conceptual. Plantea una forma de ordenación del mundo, de la sensación sometida a la medida en la que finalmente la obra se recibe como signo que desencadena sentimientos y te devuelve al mundo. Propone la recuperación de los sentidos a través de una arquitectura como vivencia no reductible al espacio geométrico. Identifica los mecanismos y herramientas que se establecen en este proceso y termina concluyendo que el dibujo es la herramienta doblemente transversal porque atiende de forma desigual a las distintas disciplinas y a los dos extremos en que se presenta la actividad creatividad en el trabajo de Juan Navarro. Estos extremos se corresponden con un conocimiento corporal inconsciente y un trabajo constante guiado por la motivación, la predeterminación y la conceptualización. 
El dibujo por la mano es el espacio de encuentro entre lo que representa la mano y la posibilidad de expresión proyectual codificada. Se produce en un territorio que se extiende desde lo analógico subyacente -que se nutre de imágenes complejas- hasta el dominio simbólico construido. Abstract. This research is a foray into Juan Navarro's transition from his "rooms and horizons" -the spontaneous manifestations of the hand and of designing- to the work that brings us back to the physical and bodily experience of the world. Juan Navarro owes much of his capacities and inquisitiveness to his hands. His hands are present in his work as subject matter ("Hand Pieces"), as a tool (hand-drawing) and as a trigger in the creative processes of his architectural work. The various works refer to a common concern: visualising space through a personal imagery. However, the creative process in each discipline develops taking into account the specificity of the medium and the experience it arouses in the observer. The work, as the completion of the creative process, is explained by the continuities and discontinuities between the tools, mechanisms and strategies used in the different media. The thesis is structured in two parts. The first studies how the creative processes come about, their mechanisms in the different plastic media, and drawing as a transversal tool; in this part the investigation identifies the concepts and themes that give rise to the work, exploring the role of the hand as the organic, biological presence responsible for a personal form of representation. The second part is divided into two chapters which, through drawing, show architecture as a model and identify the mechanisms used in his way of designing and their relationship to the work in different projects. The text is structured as a sequence of ideas articulated around a graphic universe that leads us along multiple paths from which to glimpse the creative processes of Juan Navarro. These paths are threads that weave a personal vision of the relationship between the tools and mechanisms used by Juan Navarro and his work. The way the creative process takes place, and the mechanisms and tools that set it in motion, constitute a way of approaching the work that, hitherto, has been treated in isolation, without an intention to build a structured body of knowledge. In Navarro Baldeweg's architecture there is a gap in theoretical and graphic knowledge about the process itself and his way of designing. So far, emphasis has been placed mainly on explaining the work, its references, the subjects covered, connections and transfers, without delving into the specificity of each medium. These gaps establish the need for and justify this doctoral thesis. The investigation begins by deciphering a work that, from its very beginning, deals with the duality of the gestural and the conceptual. It poses a way of ordering the world, of sensation submitted to measure, in which the work is finally received as a sign that triggers feelings and returns the observer to the world. It proposes the recovery of the senses through an architecture experienced as a lived event, not reducible to geometric space. It identifies the mechanisms and tools established in this process and concludes that drawing is a doubly transversal tool, because it caters unevenly to the various disciplines and to the two extremes between which creative activity appears in the work of Juan Navarro.
These extremes correspond to an unconscious bodily knowledge and to a constant work guided by motivation, predetermination and conceptualization. Hand-drawing is the meeting space between what the hand represents and the possibility of an encoded, projectual expression. Thus, hand-drawing takes place in a territory that extends from the underlying analogical realm -which feeds on complex images- to the constructed symbolic domain.
Abstract:
The emergence of new horizons in the field of travel assistance management leads to the development of cutting-edge systems focused on improving the existing ones. Moreover, new opportunities also arise as systems tend to become more reliable and autonomous. In this paper, a self-learning embedded system for object identification based on adaptive-cooperative dynamic approaches is presented for intelligent sensor infrastructures. The proposed system is able to detect and identify moving objects using a dynamic decision tree. It combines machine learning algorithms and cooperative strategies in order to make the system more adaptive to changing environments. Therefore, the proposed system may be very useful for many applications, such as shadow tolls (since several types of vehicles may be distinguished), parking optimization systems, systems for improving traffic conditions, etc.
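The paper's dynamic decision tree and cooperative strategies are not detailed in the abstract; the sketch below only illustrates a generic adaptive pattern with scikit-learn as a stand-in: recent labelled detections are kept in a sliding window and a small decision tree is periodically refitted on them, so the classifier can follow a changing environment. The feature layout (length, height, axle count, speed) and all numbers are invented for the example.

    from collections import deque
    from sklearn.tree import DecisionTreeClassifier

    class AdaptiveVehicleClassifier:
        # Keeps a sliding window of recent labelled detections and refits a small
        # decision tree every 'refit_every' observations.
        def __init__(self, window=500, refit_every=50):
            self.buffer = deque(maxlen=window)
            self.refit_every = refit_every
            self.tree = DecisionTreeClassifier(max_depth=5)
            self._seen = 0
            self._fitted = False

        def observe(self, features, label):
            self.buffer.append((features, label))
            self._seen += 1
            if self._seen % self.refit_every == 0 and len(self.buffer) > 10:
                X, y = zip(*self.buffer)
                self.tree.fit(list(X), list(y))
                self._fitted = True

        def classify(self, features):
            return self.tree.predict([features])[0] if self._fitted else None

    # Toy usage: feature vector = [length_m, height_m, axle_count, speed_kmh].
    clf = AdaptiveVehicleClassifier()
    for i in range(200):
        clf.observe([4.2 + (i % 3), 1.5 + 0.1 * (i % 2), 2, 90.0], "van" if i % 4 == 0 else "car")
    print(clf.classify([4.5, 1.5, 2, 95.0]))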