878 results for global positioning system
Abstract:
Think piece by Pierre Sauvé for the E15 Initiative on Strengthening the Global Trade System. In his latest essay for the ICTSD-World Economic Forum E15 initiative on Strengthening the Global Trade and Investment System for Sustainable Development, WTI Director of External Programmes and Academic Partnerships and faculty member Pierre Sauvé explores the case for fusing the law of goods with that of services in a world of global value chains. The paper does so by directing attention to the questions of whether the current architectures of multilateral and preferential trade governance are compatible with a world of trade in tasks; whether the existing rules offer globally active firms a coherent structure for doing business in a predictable environment; whether it is feasible to redesign the structure and content of existing trade rules to align them with the reality of production fragmentation; and what steps can be envisaged to better align policy with realities in the marketplace if the prospects for restructuring appear unfavourable. The paper argues that fusing trade disciplines for goods and services is neither needed nor feasible and may actually deflect attention from a number of worthwhile policy initiatives where more realistic (if never easily secured) prospects of generic rule-making may well exist.
Abstract:
The objective of this thesis is the development of a complete navigation, learning and planning system for a mobile robot. Within this broad goal, special attention is devoted to the problem of autonomous knowledge of the world: establishing mechanisms that allow the incremental development of a topological model of the robot's environment from raw sensory information. These mechanisms rest on a new concept proposed in the thesis, the sensory gradient: a mathematical device that works as a detector of events that are interesting to the system. Once such an event has been detected, the robot can identify its situation in a topological map and act accordingly. These special situations are called sensorially relevant places because (a) they capture the system's attention and (b) they can be identified from the sensory information. To exploit the models built in this way, an algorithm has been developed that produces internalized plans, establishing a network of suggestions at the sensorially relevant places so that the robot finds at those points a recommended navigation direction. Finally, a robust reactive navigation system has been implemented that interprets and adapts the internalized plans to the concrete circumstances of each moment. This navigation system is based on the artificial potential field approach, extended with the possibility of adding fictitious charges as an aid to avoiding local minima. As an additional contribution of the thesis to the wider field of cognitive science, all these elements are integrated in a memory-centred architecture, which highlights the important role of memory in the cognitive processes of living beings and offers a conceptual turn on the traditional, process-centred point of view.
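The potential-field navigation with fictitious charges described above can be illustrated with a minimal sketch. The code below is an editorial illustration under simplifying assumptions (point obstacles, a linear attractive potential, a detector that fires when the net force nearly vanishes), not the thesis implementation; all names, gains and thresholds are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def attractive_force(pos, goal, k_att=1.0):
    # Linear attractive force pulling the robot toward the goal.
    return k_att * (goal - pos)

def repulsive_force(pos, charges, k_rep=1.0, influence=2.0):
    # Sum of repulsive forces from obstacle charges and fictitious charges
    # lying within the influence radius.
    force = np.zeros(2)
    for c in charges:
        diff = pos - c
        d = np.linalg.norm(diff)
        if 1e-9 < d < influence:
            force += k_rep * (1.0 / d - 1.0 / influence) * diff / d**3
    return force

def navigate(start, goal, obstacles, steps=2000, gain=0.05, max_step=0.2, stuck_tol=0.05):
    pos, goal = np.asarray(start, float), np.asarray(goal, float)
    charges = [np.asarray(o, float) for o in obstacles]
    for _ in range(steps):
        if np.linalg.norm(goal - pos) < 0.1:
            break
        f = attractive_force(pos, goal) + repulsive_force(pos, charges)
        if np.linalg.norm(f) < stuck_tol:
            # Local minimum detected: drop a fictitious repulsive charge here and
            # take a small random step so the modified field can push the robot away.
            charges.append(pos.copy())
            pos = pos + 0.1 * rng.standard_normal(2)
            continue
        step = gain * f
        if np.linalg.norm(step) > max_step:      # cap the step length for stability
            step = max_step * step / np.linalg.norm(step)
        pos = pos + step
    return pos, len(charges) - len(obstacles)

# Obstacle sitting on the straight line to the goal, so a local minimum is reached.
final_pos, n_fictitious = navigate((0, 0), (5, 5), obstacles=[(2.5, 2.5)])
print(final_pos, n_fictitious)
```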
Abstract:
Augmented reality (AR) is increasingly being used on mobile devices. Most of the available applications are designed to work outdoors, mainly because a reliable positioning system is available there. Nevertheless, indoor (smart) spaces offer many opportunities for creating new service concepts. In particular, in this paper we explore the applicability of mobile AR to hospitality environments (hotels and similar establishments). Starting from the state of the art in technologies and applications, a portfolio of services has been identified and a prototype using off-the-shelf technologies has been designed. Our objective is to identify the next technological challenges to overcome in order to have suitable underlying infrastructures and innovative services that enhance the traveller's experience.
Abstract:
Since 1997, the Instituto Geográfico Nacional de España, through its geodesy department, has carried out the establishment of a GPS Reference Station Network (ERGPS) distributed all over Spain, which provides millimetric coordinate results as well as velocity fields in a global reference system (ITRFxx) and serves as support for other geodetic networks. Some of these stations are being integrated into the EUREF (EUropean REference Frame) Permanent Station Network. The ERGPS forms the zero order of the new Spanish geodesy.
Abstract:
Population growth, economic globalization, improving living standards and urbanization are causing important changes in the global food system and modifying dietary habits in many parts of the world (Molden, 2007; Godfray et al., 2010). The nutritional transition (linked to the development of countries and the increasing wealth of their populations) implies a shift away from traditional staple foods such as roots and tuber vegetables and a rise in the consumption of meat and milk products, refined and processed foods, as well as sugars, oils and fats (Ambler-Edwards et al., 2009). The contemporary food system puts significant pressure on natural resources, especially on land and water, because the growing food demand pushes the agricultural frontier outward, causing large impacts on ecosystems (Ambler-Edwards et al., 2009: 11-18). In addition, the trend towards diets richer in animal proteins and processed food adds further pressure on the environment, since producing them requires larger amounts of water and land (Allan, 2011; Mekonnen and Hoekstra, 2012).
Abstract:
Sustainability and the food-water-environment nexus. Food-water linkages in global agro-economic models. The CAPRI water module. Potential to jointly assess food and water policies. Pilot case study. Further development.
Abstract:
Climate change and development are inextricably linked. On the one hand, the economic development of our societies has contributed to an unsustainable increase in greenhouse gas emissions, which are destabilizing the global climate system while fostering an unequal distribution of people's ability to cope with these changes. On the other hand, there is now a broad consensus that climate change has a direct, negative impact on so-called sustainable development, and a growing agreement that it will substantially threaten our capacity to eradicate poverty in the medium and long term. Given this reality, adaptation strategies are essential to sustain development, which is why, to date, most efforts to join the global poverty and climate change agendas have focused on adaptation. However, a growing number of actors argue that, without diverting resources from adaptation, there are synergies between mitigating emissions and improving the living conditions of the most vulnerable populations, thereby contributing to sustainable development and poverty reduction. To realize this potential, it is essential to identify mitigation strategy designs that also deliver development outcomes. In this context, the main purpose of this research is to analyse the local co-benefits, for sustainable development and poverty reduction, of climate change mitigation projects implemented in Brazil. In recent years, policy-makers and academics have used the term co-benefits to refer to the potentially large and diverse range of collateral benefits associated with climate change mitigation policies, beyond the greenhouse gas (GHG) emission reductions that are, by definition, intrinsic to mitigation projects. The most relevant mitigation projects to date under the United Nations Framework Convention on Climate Change (UNFCCC) are the Clean Development Mechanism (CDM) projects of the Kyoto Protocol, which this research therefore analyses. However, alternative mitigation projects (such as the "add-on standards" of the Voluntary Carbon Markets and the Social Technologies) are also assessed as part of the study. The choice of topic is justified by its relevance at a historic moment in which the shape of the climate regime after 2020 is being negotiated. Although, at the time of writing, there is no common understanding of the form the new mitigation instruments will take, there is broad agreement that their co-benefits will be as important as, or even more important than, the GHG reductions they generate, mainly because of the pressure exerted in the climate negotiations by the least developed countries, for whom these potential co-benefits are the main incentive to take part.
The results of the thesis are structured around three research questions: how are the CDM projects implemented in Brazil generating local co-benefits that foster sustainable development and poverty reduction?; are there mitigation projects in Brazil that, because they have more stringent commitments regarding sustainable development and/or poverty reduction than the CDM, are more successful at delivering co-benefits than regular CDM projects?; and which characteristics of mitigation projects are critical for enhancing their co-benefits in the communities where they are implemented? To answer these questions, a four-year research effort was carried out, structured in several phases and combining various methodologies, ranging from the development of a co-benefits assessment model to its application both in a desktop analysis of 194 project design documents (ex-ante analysis) and in 20 case studies based on site visits to the project sites (ex-post analysis). The research confirms that the current requirements for registering a CDM project under the UNFCCC do not substantially favour the generation of local co-benefits for the communities in which the projects are implemented. In addition, practices and factors linked to the projects' intrinsic activities have been identified that are effective in increasing their co-benefits. These practices and factors can be taken into account both to improve the requirements of current CDM projects and to support the definition of the new climate instruments.
Contribution to the spatial characterization of channels with MIMO-OFDM systems in the 2.45 GHz band
Abstract:
Multiple-antenna technologies have evolved to support current and future wireless communication systems in their quest to provide the signal quality and high data rates demanded by new voice, data and multimedia services. It is essential, however, to understand the spatial characteristics of the radio channel, since the channel itself largely limits the performance of today's communication systems. This creates the need to study the spatial structure of the propagation channel in order to design, evaluate and implement multi-antenna technologies more efficiently in current and future wireless systems. Multi-antenna technologies such as smart antennas and MIMO have generated great interest in wireless communications, for example in cellular telephony and more recently in WLANs (Wireless Local Area Networks), mainly because of the improvements they provide in signal quality and data rate, respectively. Their advantages rest on the use of the spatial dimension to obtain spatial diversity gain, just as FDMA (Frequency Division Multiple Access), TDMA (Time Division Multiple Access) and CDMA (Code Division Multiple Access) obtain diversity in the frequency, time and code dimensions, respectively. This thesis studies the spatial characteristics of the channel with multiple-antenna systems by estimating direction-of-arrival (DoA) profiles under diversity schemes in space, polarization and frequency. As a first step, smart antenna and MIMO systems are reviewed, describing in detail the mathematical basis of the performance they offer. Several studies on the estimation of DoA profiles of radio channels with multi-antenna systems are then presented, evaluating aspects related to antennas, estimation algorithms, polarization schemes, and far-field and near-field source conditions. A MIMO-OFDM-SPAA3D measurement prototype in the 2.45 GHz ISM (Industrial, Scientific and Medical) band is also presented, designed to characterize experimentally the performance of MIMO systems and to characterize propagation channels spatially under space, polarization and frequency diversity. The studies are described below.
Smart antenna systems depend to a large extent on the position of the users. These systems are equipped with antenna arrays, which provide the spatial diversity needed to obtain a faithful spatial representation of the radio channel through the DoA profiles and, hence, the position of the signal sources. However, array fabrication errors and certain signal parameters degrade the performance of such direction-finding systems. A parameterized narrowband signal model is therefore proposed to study, through extensive Monte Carlo simulations, the influence of these factors on DoA estimation errors, in both azimuth and elevation, using the DoA estimation algorithms best known in the literature. From the resulting error curves, design parameters for array-based localization systems can be derived.
A second study evaluates polarization diversity schemes with multi-antenna systems to improve the estimation of far-field DoA profiles in channels with depolarization losses. A polarization-sensitive array signal model is developed that accounts for the electromagnetic field of plane waves, and Monte Carlo simulations of the model are used to study how the polarization orientation and the number of polarizations used at the transmitter and the receiver affect the accuracy of the DoA profiles observed at the receiver. In addition, DoA profiles measured in quasi-static indoor scenarios with a narrowband 4x4 MIMO prototype in the 2.45 GHz band, equipped with single- and dual-polarized antennas, are presented and show close agreement with the real propagation scenario. For obtaining these DoA profiles a method based on virtual arrays is proposed and validated with both simulated and experimental data.
Regarding the 3D localization of near-field sources (the Fresnel zone), a third study obtains the spatial structure of the propagation channel with high accuracy in a controlled indoor environment (an anechoic chamber) using virtual arrays. The study analyses the influence of the array size and the radiation pattern on the estimation of the localization parameters, proposing a signal model based on a spherical wave steering vector (SWSV). Increasing the number of antennas in the array reduces the RMS estimation error and substantially improves the spatial representation of the channel, with better results for larger arrays and larger source distances. The localization parameters (azimuth, elevation and range) are estimated with a new adaptive multilevel search method, MUSA (multilevel MUSIC-based algorithm), proposed to drastically reduce the processing time demanded by other subspace-based multivariable algorithms such as MUSIC, at the cost of higher memory requirements. Simulation results are validated with measurements and compared with the Cramer-Rao bound in terms of mean squared error; compensating for the radiation pattern brings the range estimation accuracy substantially closer to the Cramer-Rao bound, with a larger benefit for range than for angle estimation.
Finally, the theoretical and experimental evaluation of MIMO-OFDM systems is equally important. The thesis therefore presents the design and implementation of a self-calibrated MIMO-OFDM-SPAA3D measurement prototype with an automatic antenna positioning system in the 2.45 GHz band, built on a Software-Defined Radio (SDR) platform and capable of evaluating the capacity of MIMO systems. It can also characterize MIMO channels spatially, incorporating an autocalibration stage that measures the frequency response of the RF transmitters and receivers so that the phase response of the channel can be characterized more precisely. The prototype includes an automatic 3D antenna positioner (SPAA3D) based on a scanner with three mechanical arms along which an antenna positioner moves independently, controlled from a PC; this makes automatic testing possible, reduces antenna positioning errors and allows a large number of channel measurements to be collected in local regions, favouring the statistical characterization of the MIMO system parameters. A software suite has also been developed for research tasks such as MIMO channel characterization, MIMO capacity evaluation and OFDM synchronization. With this prototype, several measurement campaigns are carried out to evaluate the MIMO channel in terms of capacity, comparing two polarization schemes and taking into account the frequency diversity provided by OFDM modulation in different indoor and outdoor scenarios.
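The DoA profiles discussed above rely on subspace estimators such as MUSIC. The following is a minimal narrowband MUSIC sketch for a uniform linear array with a known number of sources; it is an illustration only and does not reproduce the thesis's polarization-sensitive model, the near-field SWSV model or the MUSA multilevel search, and all parameters are placeholders.

```python
import numpy as np

def steering_vector(theta_deg, n_elements, spacing=0.5):
    # Far-field plane-wave steering vector of a uniform linear array
    # (element spacing in wavelengths, angle measured from broadside).
    theta = np.deg2rad(theta_deg)
    n = np.arange(n_elements)
    return np.exp(-2j * np.pi * spacing * n * np.sin(theta))

def music_spectrum(snapshots, n_sources, angles=None):
    # snapshots: (n_elements, n_snapshots) complex array of received samples.
    if angles is None:
        angles = np.arange(-90.0, 90.5, 0.5)
    n_el = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    eigval, eigvec = np.linalg.eigh(R)                        # ascending eigenvalues
    En = eigvec[:, : n_el - n_sources]                        # noise subspace
    p = np.array([1.0 / np.linalg.norm(En.conj().T @ steering_vector(a, n_el)) ** 2
                  for a in angles])
    return angles, p

# Example: two sources at -20 and 35 degrees observed by an 8-element ULA.
rng = np.random.default_rng(0)
n_el, n_snap = 8, 200
A = np.column_stack([steering_vector(a, n_el) for a in (-20, 35)])
S = rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))
noise = 0.1 * (rng.standard_normal((n_el, n_snap)) + 1j * rng.standard_normal((n_el, n_snap)))
X = A @ S + noise
angles, p = music_spectrum(X, n_sources=2)
print(angles[np.argmax(p)])   # strongest pseudospectrum peak; full peak picking omitted
```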
Abstract:
Software has become the backbone of today's world, a complex human creation that influences the life, business and communication of everyone in the Information Society. The rapid growth of software development has enabled the creation of advanced technological structures called "Software Intensive Systems", capable of communicating with other systems, devices, sensors and people. Over the coming years, systems will face greater complexity, arising from the need to operate in large-scale environments with non-deterministic behaviour. Current methods and tools are not powerful enough to design, build, implement and maintain software intensive systems with these characteristics, and halting their construction, or building inflexible or unreliable systems, is not a real alternative. The development of software intensive systems may involve different entities or software companies, often in different geographical locations and made up of large, multidisciplinary and even multilingual development teams. Because the results of these independently performed activities are critical to the resulting system, the activities must be controlled and monitored to ensure the correct integration of all the elements of the complete system. The goal of this project is the creation of a software tool to support the management and monitoring of the construction and integration of software intensive systems, which can also be extended to other kinds of projects. The resulting tool is called Positioning System, a single-page web application (SPA) built with up-to-date technologies such as the AngularJS JavaScript framework on the front end and SlimPHP on the back end. Positioning System provides the functionality needed to create projects, and families and subfamilies of products that constitute the software products of those projects, as well as to manage the business partners and contacts of those projects. All these features are easily monitored and controlled through statistical graphs generated for each project.
Abstract:
The evolution of smartphones equipped with digital cameras is driving a growing demand for ever more complex applications that rely on real-time computer vision algorithms. Video signals keep growing in size, whereas the performance of single-core processors has stagnated, so new computer vision algorithms must be parallel in order to run on multiple processors and be computationally scalable. One of the most interesting classes of processors today is found in graphics processing units (GPUs): devices that offer a high degree of parallelism, excellent numerical performance and increasing versatility, which makes them attractive for scientific computing. This thesis explores two computer vision applications whose computational complexity prevents them from running in real time on traditional processors, and shows that by parallelizing their subtasks and implementing them on a GPU both applications reach interactive frame rates. In addition, a technique for the fast evaluation of arbitrarily complex functions, especially suited to GPU implementation, is proposed.
First, depth-image-based rendering techniques are applied to the synthesis of virtual views from only two distant, non-parallel (wide-baseline, convergent) cameras with colour and depth information, in contrast to the usual 3D TV configuration of close, parallel cameras. Using modified median filters to build a virtual depth map together with backward projection, these techniques are shown to be adequate for free viewpoint video. It is also shown that encoding the depth information with respect to a global reference system is highly detrimental and should be avoided.
Second, a moving-object detection (background subtraction) system based on kernel density estimation is proposed. These techniques are well suited to modelling complex scenes with multimodal backgrounds but have seen little use because of their high computational and memory cost. The proposed system, implemented in real time on a GPU, includes dynamic estimation of the kernel bandwidths, selective update of the background model, update of the positions of the reference samples of the foreground model using a multi-region particle filter, and automatic selection of regions of interest to reduce the computational cost. The results, evaluated on several databases and compared with other state-of-the-art algorithms, demonstrate the quality and versatility of the proposal.
Finally, a method is proposed for approximating arbitrary functions with continuous piecewise linear functions, especially suited to GPU implementation through the texture filtering units, which are normally unused for numerical computation. The proposal includes a rigorous mathematical analysis of the approximation error as a function of the number of samples, as well as a method for obtaining a quasi-optimal partition of the function's domain that minimizes the error.
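The kernel-density background subtraction described above can be sketched for a single grayscale frame as follows. This is an illustration assuming a fixed Gaussian bandwidth and a simple density threshold; the thesis uses dynamic bandwidth estimation, selective model updates, a multi-region particle filter and a GPU implementation, none of which are reproduced here.

```python
import numpy as np

def kde_foreground_mask(frame, samples, bandwidth=10.0, threshold=1e-3):
    """Per-pixel kernel density estimate of the background model.

    frame:   (H, W) grayscale image, float.
    samples: (N, H, W) stack of N past background frames.
    Returns a boolean mask that is True where the pixel is unlikely under
    the background density, i.e. classified as foreground.
    """
    diff = frame[None, :, :] - samples                         # (N, H, W)
    kernels = np.exp(-0.5 * (diff / bandwidth) ** 2)           # Gaussian kernel per sample
    kernels /= np.sqrt(2 * np.pi) * bandwidth
    density = kernels.mean(axis=0)                             # KDE value per pixel
    return density < threshold

# Example with synthetic data: a static noisy background plus one bright blob.
rng = np.random.default_rng(1)
background = 100 + 5 * rng.standard_normal((50, 64, 64))       # 50 background samples
frame = background.mean(axis=0).copy()
frame[20:30, 20:30] += 80                                       # foreground object
mask = kde_foreground_mask(frame, background)
print(mask.sum(), "foreground pixels")                          # roughly the 10x10 blob
```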
Abstract:
The practice of yoga has become increasingly popular, not only for its physical benefits but mainly for the psychological well-being it brings. One of the components of yoga is Prãnãyama, or breath control. Attention and respiration are two physiological, involuntary mechanisms required for the execution of Prãnãyama. The main objective of this study was to verify whether continuous EEG variables (the power of its different frequency bands) are modulated by respiratory control, comparing the two phases of the respiratory cycle (inspiration and expiration) separately under spontaneous and controlled breathing. Nineteen subjects took part in the study (7 men/12 women, mean age 36.89, SD = ±14.46), who were invited to participate on the premises of the Faculdade de Saúde of the Universidade Metodista de São Paulo. The electroencephalogram was recorded with a system of five Ag/AgCl electrodes (FPz, Fz, Cz, Pz and Oz) attached to a quick-positioning cap (Quick-Cap, Neuromedical Supplies®) following the 10-20 system. Maximum power amplitude values (power spectrum in the frequency domain) were obtained for the theta, alpha, beta and delta bands, and the theta/beta ratio was calculated separately for the two phases of the respiratory cycle (inspiration and expiration) under spontaneous breathing and under respiratory control. The respiratory cycle was recorded with an M01 respiratory effort belt (plethysmograph). The results show significant differences between the spontaneous and controlled breathing conditions, with lower mean theta/beta ratios during controlled breathing than during spontaneous breathing and mean alpha power always higher under respiratory control. Significant differences were also found when comparing inspiration and expiration during controlled breathing, with lower mean theta/beta ratios during inspiration and higher mean alpha power, especially during expiration. These findings provide evidence that respiratory control modulates electrophysiological variables related to attention, reflecting a state of alertness that is nevertheless more relaxed than during spontaneous breathing.
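As an illustration of the band-power measures compared in the study, the sketch below computes delta, theta, alpha and beta power and the theta/beta ratio from one EEG segment using a Welch spectrum. The band limits and parameters are conventional choices for illustration, not necessarily those of the study, and the data are synthetic.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(eeg, fs=256.0):
    # Welch power spectral density of one EEG channel segment,
    # then integration of the PSD over each frequency band.
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    powers = {}
    for name, (lo, hi) in BANDS.items():
        idx = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[idx], freqs[idx])
    return powers

# Example on synthetic data: 10 s of noise with a strong 10 Hz (alpha) component.
fs = 256.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
eeg = 20 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
p = band_powers(eeg, fs)
print("theta/beta ratio:", p["theta"] / p["beta"])
print("alpha power:", p["alpha"])
```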
Abstract:
With the growing exploitation of oil and gas in deep waters, there is an increasing demand for offshore operations involving cooperation between floating units. Such operations require a high level of planning and coordination, which in most cases is achieved by exchanging information at the operational level, with each floating unit commanded independently. Examples of this type of operation range from offloading operations and the installation of subsea equipment to survey operations involving multiple floating units equipped with dynamic positioning (DP) systems. The advantages of cooperative control arise from the reduction of the relative-distance error during station keeping or during joint positioning manoeuvres. In the present work, consensus control concepts are applied in combination with the DP system of each ship. The influence of the cooperative controller gains on the overall system is discussed using frequency-response analysis techniques. Full time-domain simulations and experiments with scale models are used to demonstrate the operation of the cooperative control. All simulations are carried out in the Dynasim simulator and the experimental tests in the towing tank of the Naval Engineering Department of the Escola Politécnica da Universidade de São Paulo. In addition, the towing-tank experiments are compared with equivalent numerical simulations, demonstrating the validity of the numerical tests, and it is shown that the adopted design requirements are met in the towing-tank tests.
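The combination of a consensus term with each vessel's DP controller can be sketched as follows. This is a simplified planar illustration (double-integrator surge/sway dynamics, full state feedback, no environmental loads or thrust allocation), not the controller used in the work; all gains and setpoints are placeholders.

```python
import numpy as np

def dp_consensus_step(pos, vel, targets, offsets, kp=1.0, kd=2.0, kc=0.5, dt=0.1):
    """One Euler step for n vessels modelled as planar double integrators.

    pos, vel: (n, 2) positions and velocities.
    targets:  (n, 2) individual DP setpoints.
    offsets:  (n, n, 2) desired relative positions, offsets[i, j] = pos[i] - pos[j].
    """
    n = pos.shape[0]
    force = kp * (targets - pos) - kd * vel            # individual PD (DP) law
    for i in range(n):
        for j in range(n):
            if i != j:
                # Consensus term penalising deviation from the desired offset.
                force[i] += -kc * (pos[i] - pos[j] - offsets[i, j])
    vel = vel + dt * force
    pos = pos + dt * vel
    return pos, vel

# Example: two vessels asked to keep a 50 m separation in x while station keeping.
pos = np.array([[0.0, 0.0], [60.0, 5.0]])
vel = np.zeros((2, 2))
targets = np.array([[0.0, 0.0], [50.0, 0.0]])
offsets = np.zeros((2, 2, 2))
offsets[0, 1] = [-50.0, 0.0]
offsets[1, 0] = [50.0, 0.0]
for _ in range(1000):
    pos, vel = dp_consensus_step(pos, vel, targets, offsets)
print(pos[1] - pos[0])   # converges towards the desired (50, 0) offset
```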
Abstract:
The concept of hybrid control is applied to the offloading operation between an FPWSO and a shuttle tanker. Both vessels keep their positions and headings through the action of their Dynamic Positioning (DP) systems. The offloading takes about 24 hours to complete; during this period the sea state may change and the drafts are constantly changing. A hybrid controller is designed to allow the control/observer parameters to be modified whenever a significant change in sea state and/or vessel draft occurs. The main objective of the controllers is to maintain the relative positioning between the vessels in order to avoid dangerous proximity or excessive tension in the cable. With this in mind, a new control strategy that acts on both vessels in an integrated way is proposed, based on differential geometry. Passivity-based nonlinear observers are applied to estimate position, velocity and external forces from calm to extreme seas. The criterion for switching the control/observer parameters is based on the variation of the draft and on the sea state. The draft is assumed to be known and the sea state is estimated from the peak frequency of the spectrum of the vessels' first-order motions. A perturbation model is proposed to determine the number of controllers in the hybrid system. The equivalence between the geometric controller and a controller based on Lagrange multipliers is demonstrated and, under certain assumptions, the equivalence between the geometric and PD controllers is also presented. The performance of the new strategy is evaluated by means of numerical simulations and compared with a PD controller. The results show very good performance with respect to the proposed objective, and the comparison between the geometric approach and the PD controller indicates very similar performance between them.
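The sea-state part of the switching criterion can be illustrated by estimating the peak frequency of a measured first-order motion signal and mapping it, together with the draft, to a controller index. The sketch below is hypothetical: the spectral estimate is a plain periodogram and the thresholds are placeholders, not values taken from the work.

```python
import numpy as np

def peak_frequency(motion, fs):
    # Crude periodogram of a first-order motion signal (e.g. sway or heave)
    # and the frequency of its spectral peak.
    motion = motion - motion.mean()
    spec = np.abs(np.fft.rfft(motion)) ** 2
    freqs = np.fft.rfftfreq(motion.size, d=1.0 / fs)
    return freqs[np.argmax(spec[1:]) + 1]          # skip the DC bin

def select_controller(peak_hz, draft_m, peak_thresholds=(0.08, 0.12), draft_threshold=12.0):
    # Hypothetical switching table: the sea-state index grows as the peak
    # frequency drops (longer, more severe waves); the draft flag marks a
    # loaded condition. All thresholds are placeholders.
    sea_state = sum(peak_hz < t for t in peak_thresholds)
    loaded = int(draft_m > draft_threshold)
    return sea_state, loaded

# Example: 20 minutes of synthetic wave-frequency motion with a 10 s peak period.
fs = 5.0
t = np.arange(0, 1200, 1 / fs)
rng = np.random.default_rng(3)
motion = np.sin(2 * np.pi * 0.1 * t) + 0.3 * rng.standard_normal(t.size)
fp = peak_frequency(motion, fs)
print(fp, select_controller(fp, draft_m=14.0))
```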
Abstract:
Following the recent ‘third plenum’ in China, CEPS Director Daniel Gros finds that China has reached a difficult crossroads in terms of making the necessary reforms that will foster continued growth and productivity. Continuing in the direction that so far has been followed with astounding success, namely giving the market a greater role and opening to the rest of the world, might no longer be sufficient. He points out, for example, that combating pollution requires more state intervention, not less. And similarly, strengthening a huge, potentially unstable, financial system requires stronger oversight and some continuing separation from the global financial system. Navigating this change in the right direction will be crucial not only for China, but also for the global economy.
Abstract:
The European market for asset-backed securities (ABS) has all but closed for business since the start of the economic and financial crisis. ABS (see Box 1) were in fact the first financial assets hit at the onset of the crisis in 2008. The subprime mortgage meltdown caused a deterioration in the quality of collateral in the ABS market in the United States, which in turn dried up overall liquidity because ABS AAA notes were popular collateral for inter-bank lending. The lack of demand for these products, together with the Great Recession in 2009, had a considerable negative impact on the European ABS market.
The post-crisis regulatory environment has further undermined the market. The practice of slicing and dicing of loans into ABS packages was blamed for starting and spreading the crisis through the global financial system. Regulation in the post-crisis context has thus been relatively unfavourable to these types of instruments, with heightened capital requirements now necessary for the issuance of new ABS products. And yet policymakers have recently underlined the need to revitalise the ABS market as a tool to improve credit market conditions in the euro area and to enhance the transmission of monetary policy. In particular, the European Central Bank and the Bank of England have jointly emphasised that: "a market for prudently designed ABS has the potential to improve the efficiency of resource allocation in the economy and to allow for better risk sharing... by transforming relatively illiquid assets into more liquid securities. These can then be sold to investors thereby allowing originators to obtain funding and, potentially, transfer part of the underlying risk, while investors in such securities can diversify their portfolios... . This can lead to lower costs of capital, higher economic growth and a broader distribution of risk" (ECB and Bank of England, 2014a). In addition, consideration has started to be given to the extent to which ABS products could become the target of explicit monetary policy operations, a line of action proposed by Claeys et al. (2014). The ECB has officially announced the start of preparatory work related to possible outright purchases of selected ABS.
In this paper we discuss how a revamped market for corporate loans securitised via ABS products, and the use of ABS as a monetary policy instrument, can indeed play a role in revitalising Europe's credit market. However, before using this instrument a number of issues should be addressed. First, the European ABS market has significantly contracted since the crisis. Hence it needs to be revamped through appropriate regulation if securitisation is to play a role in improving the efficiency of resource allocation in the economy. Second, even assuming that this market can expand again, the European ABS market is heterogeneous: lending criteria differ across countries and banking institutions, and the rating methodologies used to assess the quality of the borrowers have to take these differences into account. One further element of differentiation is default law, which is specific to national jurisdictions in the euro area. Therefore, the pool of loans will not only be different in terms of the macro risks related to each country of origination (which is a 'positive' idiosyncratic risk, because it enables a portfolio manager to differentiate), but also in terms of the normative side, in case of default.
The latter introduces uncertainties and inefficiencies in the ABS market that could create arbitrage opportunities. It is also unclear to what extent a direct purchase of these securities by the ECB might have an impact on the credit market. This will depend on, for example, the type of securities targeted in terms of the underlying assets that would be considered as eligible for inclusion (such as loans to small and medium-sized companies, car loans, leases, residential and commercial mortgages). The timing of a possible move by the ECB is also an issue; immediate action would take place in the context of relatively limited market volumes, while if the ECB waits, it might have access to a larger market, provided steps are taken in the next few months to revamp the market. We start by discussing the first of these issues – the size of the EU ABS market. We estimate how much this market could be worth if some specific measures are implemented. We then discuss the different options available to the ECB should they decide to intervene in the EU ABS market. We include a preliminary list of regulatory steps that could be taken to homogenise asset-backed securities in the euro area. We conclude with our recommended course of action.