30 results for Techno-fix

at Universidad Politécnica de Madrid


Relevance:

20.00%

Publisher:

Abstract:

It has taken more than a decade of intense technical and market developments for the mobile Internet to take off as a mass phenomenon. And it has arrived with great intensity: an avalanche of mobile content and applications is now overrunning us. As with the wired Web, wireless Web users will continually demand access to data and content in an efficient and user-friendly manner.

Relevance:

20.00%

Publisher:

Abstract:

Since the Digital Agenda for Europe released the Europe 2020 flagship, Member States have been looking for ways of fulfilling their agreed commitments to fast and ultrafast internet deployment. However, Europe is not a homogeneous reality. The economic, geographic, social and demographic features of each country make it a highly diverse region in which to develop best practices for Next Generation Access Network (NGAN) deployments. There are particular concerns about NGAN deployments for “the final third”, the last 25% of a country's population who usually live in rural areas.

This paper assesses, through a techno-economic analysis, the access cost of providing over-30 Mbps broadband to the final third of Spain's population, with municipalities classified into area types referred to as geotypes. Fixed and mobile technologies are compared in order to determine the most cost-effective technology for each geotype, and the demographic limit for fixed networks (cable, fibre and copper) is also discussed. The assessment focuses on the supply side and the results show the access network cost only. The research completes a previously published assessment (Techno-economic analysis of next generation access networks roll-out. The case of platform competition, regulation and public policy in Spain) by including the LTE scenario. The LTE scenario is dimensioned to provide 30 Mbps (best effort) broadband, considering a network take-up of 25%. The Rocket techno-economic model is used to assess a ten-year study period; nevertheless, the deployment must start in 2014 and be completed by 2020 in order to fulfil the Digital Agenda's goals. The feasibility of the deployment is defined as the ability to recoup the investment at the end of the study period; this ability is closely related to network take-up and, therefore, to service adoption. For clarity and simplicity, the network deployment in each geotype is compared with the cost of deployment in the Urban geotype and with the expected broadband penetration rates. Debating the cost-effective deployments for each geotype, while addressing the Digital Agenda's goals regarding fast and ultrafast internet, is the main purpose of this paper.

At the end of last year, the independent Spanish regulator released the Spain broadband coverage report for the first half of 2013. This document stated that 59% and 52% of Spain's population was already covered by NGANs capable of providing 30 Mbps and 100 Mbps broadband respectively. HFC, with 47% population coverage, and FTTH, with 14%, were considered 100 Mbps-capable NGANs, while VDSL, with 12% of the population covered, was the only NGAN considered for the 30 Mbps segment. Despite not being an NGAN, the 99% population coverage of HSPA networks was also noted in the report. Since mobile operators are also required to provide 30 Mbps broadband to 90% of the population in rural areas by the end of 2020, mobile networks will play a significant role in achieving the 30 Mbps goal in Spain's final third.

The assessment indicates the cost of deployment per cumulative household coverage with four different NGANs: FTTH, HFC, VDSL and LTE. The research shows that an investment ranging from €2,700 million (VDSL) to €5,400 million (HFC) will be needed to cover the first half of the population with any of the fixed technologies assessed. The results state that at least €3,000 million will be required to cover these areas with the least expensive technology (LTE).

However, if we consider the throughput that fixed networks could provide and the achievement of the Digital Agenda's objectives, fixed network deployments are recommended for up to 90% of the population. Fibre and cable deployments could cost-efficiently cover up to a maximum of 88% of the Spanish population. As there are some concerns about service adoption, we recommend VDSL and mobile network deployments for the final third of the population. Although LTE can provide the most economical roll-out, VDSL could also cost-efficiently provide 50 Mbps to the segment between 75% and 90% of the Spanish population. For this population gap, facility-based competition between VDSL and LTE providers should be encouraged. For the segment between 90% and 98.5% of the Spanish population, LTE deployment is the most appropriate; since customers in less populated municipalities are more sensitive to the cost of the service, we consider that a single network deployment could be most appropriate there. Finally, it has become clear that it is not possible to deliver 30 Mbps to the final 1.5% of the population cost-efficiently, and adoption predictions are not optimistic either. As there are other broadband alternatives able to deliver up to 20 Mbps, in the authors' opinion it is not necessary to cover these extreme rural areas, where public financing would be required.
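As an illustration of the kind of per-geotype comparison described above, the sketch below contrasts hypothetical access costs per connected household for three technologies across household-density geotypes. The geotype densities, cost parameters and technology figures are invented placeholders, not the Rocket model's inputs or the paper's results; only the 25% take-up assumption is taken from the text.

```python
# Minimal sketch of a geotype-level access-cost comparison.
# All figures are illustrative placeholders, NOT the paper's data.

GEOTYPES = {            # households per km^2 (hypothetical)
    "Urban": 2000,
    "Suburban": 600,
    "Rural": 60,
    "Remote rural": 8,
}

# Hypothetical cost parameters per technology:
# (fixed cost per covered km^2, incremental cost per connected household)
TECH = {
    "FTTH": (90_000, 250),
    "VDSL": (40_000, 120),
    "LTE":  (25_000, 60),
}

TAKE_UP = 0.25  # network take-up assumed in the paper's LTE scenario


def access_cost_per_household(density, area_cost, per_home_cost, take_up=TAKE_UP):
    """Deployment cost allocated to each connected household."""
    connected_per_km2 = density * take_up
    return area_cost / connected_per_km2 + per_home_cost


if __name__ == "__main__":
    for geotype, density in GEOTYPES.items():
        costs = {t: access_cost_per_household(density, *p) for t, p in TECH.items()}
        cheapest = min(costs, key=costs.get)
        summary = ", ".join(f"{t}: {c:,.0f} EUR/home" for t, c in costs.items())
        print(f"{geotype:<13} {summary}  -> cheapest: {cheapest}")
```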

Relevance:

20.00%

Publisher:

Abstract:

Techno-economic study of the provision of 30 Mbps mobile broadband to the final third of the Spanish population. Competition between platforms and infrastructures.

Relevance:

20.00%

Publisher:

Abstract:

- Towards a methodology for prospective deployment of ICT infrastructures
- (Technologies & Architectures)
- Key deployment parameters (network requirements)
- User requirements
- A proposal for Cost-Benefit Analysis

Relevance:

20.00%

Publisher:

Abstract:

The LTE standard has been singled out as one of the keys for telecom operators to address, in a cost-efficient way, the growth in mobile traffic demand foreseen for the coming years, since its core network is more scalable and its radio interface more flexible than those of its predecessor technologies. Regulators, in turn, need to guarantee adequate, equitable and non-discriminatory access to radio spectrum, enabling a stable environment for the deployment of advanced mobile communication networks. Besides the reform of the spectrum regulatory framework in Europe, which allowed the deployment of new technologies in the historic GSM bands, additional spectrum has been made available to IMT systems in new frequency bands, which in turn has set out new challenges for technology and regulation. The fragmentation of the spectrum available for mobile communications has driven the development of carrier aggregation techniques in the most recent releases of the LTE standard, which allow the radio resources to be better exploited as a whole. Nonetheless, spectrum below 1 GHz remains scarce, since mobile traffic keeps growing and the 900 MHz band is still used for GSM services, which has only worsened the dispute between terrestrial broadcasting and mobile communication services over the upper part of the UHF band. In particular, the 700 MHz band is shaping up as one of the next bands to increase the spectrum available for mobile services, although its release by the current Digital Terrestrial Television networks poses considerable difficulties in Member States where DTT is the main free-to-air audiovisual platform, opening a debate on the long-term audiovisual model in Europe. In parallel, the public policies of the present decade to promote fast and ultrafast broadband access have set ambitious objectives for 2020, both at the European level and in the individual Member States. Universal access to broadband networks of at least 30 Mbps is one of the main challenges. The expectations raised by LTE technology and the availability of new frequency bands give fixed wireless access services particular relevance in light of these public policy objectives which, as has been acknowledged on several occasions, can only be met with a combination of different technologies.

This PhD dissertation develops a series of techno-economic models to carry out a prospective analysis of three cases of special relevance in the deployment of LTE networks: first, the economic valuation of the 700 MHz band; second, the evaluation of business models and cost reduction considering femtocell technologies; and finally, the feasibility of LTE fixed wireless access networks for closing the digital divide in 30 Mbps broadband access.

Regarding the application of techno-economic analysis to the valuation of 700 MHz spectrum, the results reveal two core issues. First, operators need to be assigned more spectrum to meet mobile traffic demand forecasts in the medium term. Second, there is a substantial difference in LTE deployment costs between having and not having spectrum below 1 GHz, but this difference shrinks as additional sub-1 GHz spectrum is added. The allocation of the 700 MHz band to mobile communication services therefore brings a relevant reduction in deployment costs if the operator holds no spectrum in the 800 MHz band, but not if it already holds low-band spectrum for the deployment. It can thus be concluded that the price operators will be willing to pay for 700 MHz spectrum will depend on whether they already hold spectrum in the 800 MHz band. Since competition for this new spectrum will be weaker, the revenue expected from 700 MHz awards will generally be lower than that from the digital dividend, even though for some operators this spectrum would be as valuable as the 800 MHz band.

Regarding femtocell deployment, conclusions can be drawn in terms of deployment cost savings and of the viability of the business models they enable. The savings provided by introducing femtocells into an LTE deployment, compared with an exclusively macrocellular deployment, have been shown to grow as the bandwidth available to the macrocellular network shrinks. For a convergent operator, femtocell deployment therefore makes economic sense if the available bandwidth is scarce (around 2x10 MHz), which in Spain may reflect the case of fixed-segment operators that are new entrants in the mobile market. Open access models, in turn, are interesting for mobile-only operators, because they make costs more flexible by replacing macrocell base stations with femtocells, but they need to be deployed in relatively densely populated areas so that each femtocell offloads traffic from several macrocell users simultaneously. Femtocells are beneficial in any case if the user bears the cost of the femtocell and its backhaul, which only seems likely if they are integrated into a business model for marketing new services. Hence, in much of the casuistry studied, femtocell deployment only makes sense if it increases revenue per user by marketing value-added services that require guaranteed quality of service, thereby exploiting its main competitive advantage over WiFi.

Finally, regarding the role of LTE in the provision of fixed wireless access services for 30 Mbps broadband, a TD-LTE model has been developed and a prospective techno-economic study has been carried out for the Spanish case. The results foresee an FTTH coverage footprint of 74% of households by 2020 and show that a TD-LTE network in the 3.5 GHz band is feasible for increasing 30 Mbps service coverage by a further 14 percentage points. Together with the coverage of other networks, feasible deployments would bring 30 Mbps coverage in Spain to 95% in 2020. In summary, the results show in every case the capability of LTE technology to face new challenges regarding both mobile traffic growth, which is particularly critical in urban areas, and the closing of the digital divide in fast broadband access in the most rural areas.
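The following sketch illustrates, under stated assumptions, the coverage-driven reasoning behind the sub-1 GHz cost differences discussed above: far fewer sites are needed when a low band is available, while a second low band adds comparatively little. The cell radii, area and site cost are illustrative guesses, not the thesis' model parameters.

```python
# Hedged sketch: coverage-limited site counts and deployment cost per scenario.
# All numbers below are illustrative assumptions, not the thesis' inputs.

import math

AREA_KM2 = 10_000          # hypothetical rural region to cover
COST_PER_SITE = 120_000    # EUR, hypothetical all-in cost per site

# Hypothetical cell radii (km): propagation is better below 1 GHz.
CELL_RADIUS = {
    "2.6 GHz only":      1.0,
    "800 MHz available": 2.5,
    "800 MHz + 700 MHz": 2.7,   # the extra low band mostly adds capacity
}


def sites_needed(radius_km, area_km2=AREA_KM2):
    """Coverage-limited site count for hexagonal cells of the given radius."""
    cell_area = 3.0 * math.sqrt(3.0) / 2.0 * radius_km ** 2
    return math.ceil(area_km2 / cell_area)


if __name__ == "__main__":
    for scenario, radius in CELL_RADIUS.items():
        n = sites_needed(radius)
        print(f"{scenario:<20} {n:>5} sites  ~ {n * COST_PER_SITE / 1e6:6.1f} M EUR")
```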

Relevance:

10.00%

Publisher:

Abstract:

This article analyses a number of social and cultural aspects of the blog phenomenon with the methodological aid of a complexity model, the New Techno-social Environment (hereinafter also referred to by its Spanish acronym, NET, or Nuevo Entorno Tecnosocial), together with the socio-technical approach of the two blogologist authors, both researchers interested in the new reality of the Digital Universal Network (DUN). After a review of some basic definitions, the article highlights some key characteristics of an emerging blog culture and relates them to the properties of the NET. Then, after a brief practical parenthesis for people entering the blogosphere for the first time, we present some reflections on blogs as an evolution of virtual communities and on the changes experienced by the inhabitants of the infocity emerging from within the NET. The article concludes with a somewhat disturbing question: whether among these changes there might not be a gradual transformation of the structure and form of human intelligence.

Relevance:

10.00%

Publisher:

Abstract:

Next generation access networks (NGAN) will support a renewed electronic communication market where the main opportunities lie in the provision of ubiquitous broadband connectivity, applications and content, and a wealth of innovations is expected from their deployment. Within this framework, the project reviews the variety of NGAN deployment options available for rural environments, derives a simple method for approximate cost calculations, and then discusses and compares the results obtained. Data for Spain are used for the practical calculations, but the model is applicable, with minor modifications, to most rural areas of European countries. The final part of the paper reviews the techno-economic implications of a network deployment in a rural environment as well as the adequacy and possible developments of the regulatory framework involved.
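A minimal sketch of a "simple method for approximate cost calculations" of the kind mentioned above, assuming homes on a regular grid so that the trench length per home scales with the inverse square root of household density. The unit costs are hypothetical, not the project's figures.

```python
# Rough per-home-passed FTTH cost as a function of household density.
# Unit costs are illustrative assumptions only.

import math


def ftth_cost_per_home_passed(homes_per_km2, civil_cost_per_m=40.0,
                              equipment_per_home=150.0):
    """Approximate FTTH cost per home passed in a square service area.

    Assumes homes on a regular grid, so the average trench length per home
    scales with 1/sqrt(density).
    """
    spacing_m = 1000.0 / math.sqrt(homes_per_km2)   # average distance between homes
    return spacing_m * civil_cost_per_m + equipment_per_home


if __name__ == "__main__":
    for density in (1000, 200, 50, 10):             # urban .. deep rural
        print(f"{density:>5} homes/km2 -> "
              f"{ftth_cost_per_home_passed(density):8.0f} EUR per home passed")
```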

Relevance:

10.00%

Publisher:

Abstract:

Next generation access networks (NGAN) will support a renewed communication structure where opportunities lie in the provision of ubiquitous broadband connectivity, a wide variety of new applications, appealing content and general support for the sustainable growth of diverse sectors. A wealth of innovations, job creation and a new wave of economic growth are expected from their deployment. In this paper we discuss the role that Hybrid Fibre Coax (HFC) could play in the NGAN roadmap. We propose a simplified model for making approximate cost calculations for HFC deployment based on the geographic and sociodemographic characteristics of Spain. Considering the latest evolution of HFC based on DOCSIS 3.0, from integrated (I-CMTS) towards modular (M-CMTS), the results from the model are compared with the most competitive NGAN for ultra-broadband speeds: Fibre to the Home (FTTH) based on Gigabit-capable Passive Optical Networks (GPON).
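A hedged sketch of the kind of HFC-versus-GPON comparison the paper describes: cost per paying subscriber as a function of take-up, with the HFC figures reflecting reuse of an existing coaxial plant. All unit costs are illustrative assumptions, not the model's values.

```python
# Cost per subscriber for an HFC upgrade vs a greenfield GPON build.
# Unit costs below are illustrative placeholders, not the paper's results.

SCENARIOS = {
    # technology: (cost per home passed, cost per home connected), EUR
    "HFC DOCSIS 3.0 upgrade": (60.0, 120.0),   # reuses the existing coax plant
    "FTTH GPON greenfield":   (350.0, 250.0),  # new civil works dominate
}


def cost_per_subscriber(passed_cost, connected_cost, take_up):
    """Total deployment cost allocated to each paying subscriber."""
    return passed_cost / take_up + connected_cost


if __name__ == "__main__":
    for take_up in (0.2, 0.4):
        print(f"take-up {take_up:.0%}")
        for tech, (cpp, cpc) in SCENARIOS.items():
            eur = cost_per_subscriber(cpp, cpc, take_up)
            print(f"  {tech:<24} {eur:7.0f} EUR/subscriber")
```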

Relevance:

10.00%

Publisher:

Abstract:

This study surveys new trends in seismic design, focusing on the base-isolation technique as the most effective, widespread and widely used one, and analyses the structural and economic advantages that a building applying this technique may have. The most common type of reinforced-concrete building likely to be isolated is chosen, in this case a hospital; its fixed-base model is analysed under several seismic codes, mainly comparing base shear forces and taking soil-structure interaction into account. To support this calculation, a program of beam elements with 6 degrees of freedom per node is developed in Matlab code. The isolated model includes the analysis of three combinations of isolator types (HDR, LPR and FPS), alternating simplified linear models of 1 and 3 degrees of freedom per storey, evaluating the differences in the structural response and selecting the combination giving the most suitable results; the explicit central difference method is used for the nonlinear modelling of each isolation system. Finally, a comparative analysis of the damage expected under the design earthquake is carried out using the rapid method, taking the spectral displacement of the top storey as the reference, and conclusions and recommendations on the use of isolation systems are given.
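The explicit central difference method mentioned above can be illustrated with a minimal single-degree-of-freedom sketch: an isolated mass with a bilinear hysteretic isolator is stepped through an illustrative ground-motion pulse. The mass, stiffnesses, yield force and input pulse are assumed values, not the study's hospital model.

```python
# Explicit central-difference time stepping of one isolated mass with a
# bilinear hysteretic isolator. All parameters are illustrative assumptions.

import math

M = 500e3         # mass [kg]
C = 50e3          # viscous damping [N*s/m]
K1 = 80e6         # initial isolator stiffness [N/m]
K2 = 8e6          # post-yield isolator stiffness [N/m]
FY = 400e3        # isolator yield force [N]

DT = 1e-3         # time step [s], well below the explicit stability limit
STEPS = 5000      # 5 s of response


def isolator_force(u, u_ref, f_ref):
    """Bilinear hysteretic restoring force (kinematic hardening)."""
    f_trial = f_ref + K1 * (u - u_ref)
    offset = FY * (1.0 - K2 / K1)
    return min(max(f_trial, K2 * u - offset), K2 * u + offset)


def ground_accel(t):
    """Illustrative 1 Hz ground-motion pulse [m/s^2]."""
    return 3.0 * math.sin(2.0 * math.pi * t) if t < 2.0 else 0.0


# System starts at rest with zero initial load, so u(-DT) = 0 exactly.
u_prev, u = 0.0, 0.0          # displacement at t - DT and t
u_ref, f_ref = 0.0, 0.0       # hysteretic state from the previous step
u_max = 0.0

for n in range(STEPS):
    t = n * DT
    p = -M * ground_accel(t)                      # effective earthquake force
    fs = isolator_force(u, u_ref, f_ref)          # restoring force at known u_n
    lhs = M / DT**2 + C / (2.0 * DT)
    rhs = p - fs + (2.0 * M / DT**2) * u - (M / DT**2 - C / (2.0 * DT)) * u_prev
    u_next = rhs / lhs                            # central-difference update
    u_ref, f_ref = u, fs                          # commit hysteretic state
    u_prev, u = u, u_next
    u_max = max(u_max, abs(u))

print(f"peak isolator displacement ~ {u_max * 1000.0:.1f} mm")
```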

Relevance:

10.00%

Publisher:

Abstract:

Natural Computing has emerged as an alternative to classical computing for problems that cannot be solved efficiently in polynomial time with respect to the input. This discipline either uses nature itself as the basis of computation or simulates its behaviour in order to obtain better solutions than those found with classical models. Within Natural Computing, and as a cell-level representation, Membrane Computing arises. Transition P systems are the first abstraction of the membranes found in cells. These systems, which could be implemented in biological or electronic media, are the main topic of this thesis.

The implementations developed in this field so far are surveyed in order to focus on distributed implementations, which are the ones that can exploit the intrinsic parallelism and non-determinism of these systems. After a detailed study of the current state of the stages involved in the system's evolution, the work concludes that the distributions seeking a balance between the two stages (rule application and communication) give the best results. To define such distributions, the system, and every element that influences its transition, must be fully defined. Building on, and alongside, the work of other researchers, variations are made to the proxies and distribution architectures so that the dynamic behaviour of the P systems is completely defined.

Starting from the static information (the initial configuration) of the P system, membranes can be distributed over the processors of a cluster to obtain good evolution times, so that the computation of the P system is carried out in as little time as possible. These distributions must take into account the architecture, that is, the way the cluster's processors are connected. Because there are four architectures, the distribution process depends on the architecture used and, therefore, although with significant similarities, the distribution algorithms must also be implemented four times. Although the proponents of these architectures have studied the optimal time of each one, the absence of distributions for them has led this thesis to test all four, until it could be established that practice agrees with the theoretical studies.

No deterministic algorithm is known that produces a distribution satisfying the requirements of the architecture for an arbitrary P system. Given the complexity of the problem, the use of Natural Computing metaheuristics is therefore proposed. First, Genetic Algorithms are proposed: a distribution can always be built and, on the premise that individuals improve with evolution, the distributions also improve as the algorithm evolves, yielding times close to the theoretical optimum. For the architectures that preserve the tree topology of the P system, new representations and new crossover and mutation operators had to be devised. A more detailed study of the membranes and of the communication between processors showed that the overall times used for the distribution can be improved and individualised per membrane; the same algorithms were then tested, obtaining further distributions that improve the times. Likewise, Particle Swarm Optimization and Grammatical Evolution with grammar rewriting (a Grammatical Evolution variant presented in this thesis) have been applied to the same task, obtaining other kinds of distributions and allowing the architectures to be compared.

Finally, the use of estimators for application and communication times, and the changes in the membrane tree topology that can occur non-deterministically as the P system evolves, make it necessary to monitor the system and, where needed, redistribute membranes over the processors in order to keep evolution times reasonable. How, when and where these modifications and redistributions should be carried out, and how the recalculation can be performed, is explained.
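A hedged sketch of the genetic-algorithm distribution idea described above: the membranes of a small hypothetical tree are assigned to processors so that an estimate of one evolution step (the busiest processor's rule-application time plus a cost for each tree edge cut between processors) is minimised. The tree, the timing figures and the simple truncation-selection GA are illustrative assumptions, not the thesis' architectures or operators.

```python
# Toy GA assigning membranes to processors; all figures are illustrative.

import random

random.seed(1)

# Hypothetical membrane tree: child -> parent (membrane 0 is the skin).
PARENT = {1: 0, 2: 0, 3: 1, 4: 1, 5: 2, 6: 2, 7: 3, 8: 3, 9: 4}
MEMBRANES = sorted(set(PARENT) | set(PARENT.values()))
APP_TIME = {m: random.uniform(1.0, 5.0) for m in MEMBRANES}   # rule application
COMM_TIME = 2.0                                               # per cut tree edge
PROCESSORS = 3


def step_time(assign):
    """Estimated duration of one evolution step for a given assignment."""
    busiest = max(sum(APP_TIME[m] for m in MEMBRANES if assign[m] == p)
                  for p in range(PROCESSORS))
    cut_edges = sum(1 for child, parent in PARENT.items()
                    if assign[child] != assign[parent])
    return busiest + COMM_TIME * cut_edges


def random_individual():
    return {m: random.randrange(PROCESSORS) for m in MEMBRANES}


def crossover(a, b):
    cut = random.randrange(len(MEMBRANES))
    return {m: (a if i < cut else b)[m] for i, m in enumerate(MEMBRANES)}


def mutate(ind, rate=0.1):
    return {m: random.randrange(PROCESSORS) if random.random() < rate else p
            for m, p in ind.items()}


def evolve(pop_size=40, generations=100):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=step_time)
        parents = pop[:pop_size // 2]                 # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=step_time)


if __name__ == "__main__":
    best = evolve()
    print("best estimated step time:", round(step_time(best), 2))
    print("membrane -> processor:", best)
```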

Relevance:

10.00%

Publisher:

Abstract:

Since humanity dwelt in caves it has shown an innate desire to record and reproduce "snapshots with which to perpetuate itself or in which to look at itself". The appearance and development of photography as a means of capturing and fixing "the direct image of the surrounding reality" soon became a new aesthetic and poetic language allowing the artist to interpret and reflect on what is observed. The photographer's gaze is imprinted on the image, establishing a conceptual dialogue with the play of light.

This thesis proposes the creation of a new architectural skin through photographic printing on stone materials. The search for the expressiveness of materials as a support for artistic expression involves a change of scale, as the photographic snapshot is transferred to architecture, and the use of a new support, as the photograph is printed on architectural materials. The choice of the CO2 laser device as the photographic printing system on architectural stone materials is justified as the technique that allows the physical union of the image and the architectural project, generating added value through the art of photography. The choice of the materials investigated, Silestone® Blanco Zeus and GRC® with TX Active® Aria, is also justified: the research into this new architectural skin covers both the building envelope and its interior volume, closing the architectural "in & out" circle and giving the architectural project added value by introducing sustainable concepts of an aesthetic and environmental nature.

Companies in the architectural sector directly related to the production and distribution of Silestone® and GRC®, as well as companies specialised in photographic printing systems on materials, were consulted to establish the state of the art. The history of photography is reviewed from its origins to the development of the digital era and its artistic condition is analysed. The photographic printing systems that have evolved in parallel with image-capture devices are compiled, and the CO2 laser printing system is described in depth. The manufacturing processes, technical characteristics, qualities and applications of the architectural stone materials Silestone® Blanco Zeus and GRC® with TX Active® Aria are described, as are the technique used to capture the photographic image, its artistic justification, and its printing with the CO2 laser device under different parameters on samples of the materials investigated.

The feasibility of developing the new architectural skin on Silestone® Blanco Zeus and GRC® with TX Active® Aria is verified by subjecting pieces printed under different parameters to three laboratory trials. For each trial, the objective and procedure, the samples tested and the parameters under which they were printed, the analysis of the results and the conclusions are specified.

Thermal amplitude trial. The degree to which the printed images are affected by thermal contrasts is determined. Series of Silestone® Blanco Zeus and GRC® with TX Active® Aria samples printed with the CO2 laser are subjected to 12-hour cold-heat cycles with a total thermal amplitude of 102 °C. Magnified photographs of each piece are taken systematically before and after the cold-heat cycles, and the transformations the materials undergo under the action of the CO2 laser are observed.

Ultraviolet (UV) radiation exposure trial. The degree to which the printed images are affected when the self-cleaning action on organic particles is activated is determined. A series of GRC® with TX Active® Aria samples printed with the CO2 laser is subjected to 26-hour cycles of ultraviolet exposure, after a procedure to activate the TX Active® additive. Organic contamination is simulated by the controlled application of Rhodamine B, an organic dye, and UV radiation is simulated with an ultraviolet bulb. Macroscopic photographs of the tested series are taken systematically: before the application of the Rhodamine B and at 00:00 h, 04:00 h and 26:00 h of the trial. The histograms of the photographs are then extracted and analysed as a record of the photocatalytic activity.

Trial of the self-decontaminating capability of GRC® with TX Active® printed with the CO2 laser. It is checked whether the self-decontaminating capability of GRC® with TX Active® is altered as a consequence of the CO2 laser-printed photographic image. A series of GRC® with TX Active® Aria samples printed with the CO2 laser is subjected to self-decontamination tests: each piece is placed in a controlled atmosphere contaminated with nitrogen oxides under an ultraviolet (UV) lamp, and the photocatalytic activity is recorded from the variation in nitrogen oxide concentration.

The analysis and interpretation of the laboratory results are compiled and the general conclusions of the research are drawn. Future research lines that could be developed in the field of photographic printing on architectural materials are outlined, and the technological and artistic output of the preliminary research that gave rise to this doctoral thesis is described.
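As a small illustration of the histogram-based reading of photocatalytic activity described above, the sketch below reduces each macro photograph of the stained area to a grey-level histogram and mean value, whose recovery towards the clean-substrate level would indicate degradation of the Rhodamine B. The file names and crop box are hypothetical; this is not the study's analysis script.

```python
# Grey-level histogram and mean value of the stained region of each photograph.
# File names and the crop box are hypothetical placeholders.

from PIL import Image
import numpy as np

SNAPSHOTS = ["sample_00h.jpg", "sample_04h.jpg", "sample_26h.jpg"]  # hypothetical
STAIN_BOX = (100, 100, 400, 400)   # hypothetical crop around the stained area


def stain_statistics(path, box=STAIN_BOX):
    """Histogram and mean grey level of the stained region of one photograph."""
    grey = np.asarray(Image.open(path).convert("L").crop(box), dtype=float)
    hist, _ = np.histogram(grey, bins=256, range=(0, 255))
    return hist, grey.mean()


if __name__ == "__main__":
    for path in SNAPSHOTS:
        _, mean_level = stain_statistics(path)
        print(f"{path}: mean grey level = {mean_level:.1f}")
```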

Relevance:

10.00%

Publisher:

Abstract:

A broadband primary standard for thermal noise measurements is presented and its thermal and electromagnetic behaviour is analysed by means of analytical and numerical simulation techniques. It consists of a broadband termination connected to a 3.5 mm coaxial airline partially immersed in liquid nitrogen. The main innovative part of the device is the thermal bead between the inner and outer conductors, designed to obtain proper thermal contact while keeping both its contribution to the total thermal noise and its reflectivity low. A sensitivity analysis is carried out in order to fix the manufacturing tolerances for proper performance in the range 10 MHz to 26.5 GHz.
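A worked sketch of the noise-temperature bookkeeping behind such a standard: the cryogenic termination's noise is propagated through lossy line sections, each at its own physical temperature, using the standard relation T_out = T_in/L + T_phys(1 - 1/L). The section losses and temperatures below are illustrative assumptions, not the device's measured data.

```python
# Noise temperature at the output of a chain of lossy line sections.
# All losses and physical temperatures are illustrative assumptions.

T_TERMINATION = 77.36            # boiling liquid nitrogen [K]

# (physical temperature [K], section loss [dB]) from the cold end to the connector
SECTIONS = [
    (77.36, 0.05),               # immersed part of the airline
    (180.0, 0.10),               # thermal-bead / transition region
    (296.0, 0.05),               # ambient part up to the output connector
]


def cascade_noise_temperature(t_source, sections):
    """Noise temperature delivered at the output of a chain of lossy sections."""
    t = t_source
    for t_phys, loss_db in sections:
        loss = 10.0 ** (loss_db / 10.0)          # linear power-loss factor
        t = t / loss + t_phys * (1.0 - 1.0 / loss)
    return t


if __name__ == "__main__":
    t_out = cascade_noise_temperature(T_TERMINATION, SECTIONS)
    print(f"output noise temperature ~ {t_out:.2f} K")
```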

Relevance:

10.00%

Publisher:

Abstract:

Mobile games are a prime example of a successful mobile application and demonstrate the increasing range of platforms for the media and entertainment industries. Against this convergent background, this paper introduces the basic features of the mobile gaming market and its industrial ecosystem together with its main actors and activities. The focus of the paper lies in the challenges ahead for the evolution of mobile applications into a potentially dominant game platform and the possible disruptions along this road. The deep personal relationships between users and their mobile devices are considered to further explore the link between mobile games, players’ strategies and pending techno-economic developments. The paper concludes with a brief discussion of some policy options to assist with the development of this domain.

Relevance:

10.00%

Publisher:

Abstract:

Most empirical disciplines promote the reuse and sharing of datasets, as it leads to a greater possibility of replication. While this is increasingly the case in Empirical Software Engineering, some of the most popular bug-fix datasets are now known to be biased. This raises two significant concerns: first, that sample bias may lead to under-performing prediction models, and second, that the external validity of studies based on biased datasets may be suspect. This issue has raised considerable consternation in the ESE literature in recent years. However, there is a confounding factor in these datasets that has not been examined carefully: size. Biased datasets sample only some of the data that could be sampled, and do so in a biased fashion; but biased samples can be smaller or larger. Smaller datasets in general provide a less reliable basis for estimating models and could thus lead to inferior model performance. In this setting, we ask: what affects performance more, bias or size? We conduct a detailed, large-scale meta-analysis, using simulated datasets sampled with bias from a high-quality dataset which is relatively free of bias. Our results suggest that size always matters just as much as bias direction, and in fact much more than bias direction when considering information-retrieval measures such as AUC and F-score. This indicates that, at least for prediction models, even when dealing with sampling bias, simply finding larger samples can sometimes be sufficient. Our analysis also exposes the complexity of the bias issue and raises further issues to be explored in the future.
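A hedged sketch of the style of simulation described above: biased and unbiased training samples of varying size are drawn from a synthetic "full" dataset, a simple prediction model is fitted, and held-out AUC is compared. The data generator, bias rule and model are illustrative stand-ins, not the paper's datasets or protocol.

```python
# Compare the effect of sample size vs sampling bias on held-out AUC.
# Synthetic data and a simple bias rule are used as illustrative stand-ins.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=20_000, n_features=10, weights=[0.8],
                           random_state=0)
X_train, y_train = X[:15_000], y[:15_000]
X_test, y_test = X[15_000:], y[15_000:]


def sample(n, biased):
    """Draw n training rows; the biased rule under-samples positive examples."""
    p = np.where(y_train == 1, 0.3, 1.0) if biased else np.ones(len(y_train))
    idx = rng.choice(len(y_train), size=n, replace=False, p=p / p.sum())
    return X_train[idx], y_train[idx]


if __name__ == "__main__":
    for n in (200, 1000, 5000):
        for biased in (False, True):
            Xs, ys = sample(n, biased)
            model = LogisticRegression(max_iter=1000).fit(Xs, ys)
            auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
            print(f"n={n:5d} biased={biased!s:5} AUC={auc:.3f}")
```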

Relevance:

10.00%

Publisher:

Abstract:

In recent years a large number of footbridges have been built in response to the demand for new pedestrian crossings in cities. These structures have less demanding construction requirements than other types of bridges, which has made it easier to develop designs with new structural schemes, complicated geometries and new materials. In general these structures are slender, light and lightly damped, which has sometimes led to vibration problems under pedestrian traffic once in service. Current codes are increasingly sensitive to this problem: they recommend designs whose natural frequencies lie away from the typical pedestrian pacing-rate ranges and they set comfort limits in the form of maximum permitted accelerations, thereby ensuring the correct behaviour of the structure.

This paper analyses the problem from a practical point of view. The key points of the footbridge codes and design guidelines currently found in the literature are presented, the techniques usually employed in the experimental dynamic analysis of these structures are described, and the solutions generally adopted to improve their dynamic behaviour are discussed. Finally, the work carried out by Centro Tecnológico CARTIF in collaboration with the Universities of Valladolid and Castilla-La Mancha on the footbridge of the Science Museum of Valladolid is presented. This work includes: (1) the dynamic study of the three steel spans of the footbridge, (2) the design and implementation of a tuned mass damper in the span most sensitive to vibrations, (3) the implementation of an active mass damper using an electrodynamic shaker, and (4) field tests to verify the serviceability of that span.
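As an illustration of point (2), the sketch below sizes a passive tuned mass damper with the classic Den Hartog rules for a harmonically forced, lightly damped mode. The modal mass, frequency and mass ratio are assumed values, not the Valladolid footbridge data.

```python
# Den Hartog TMD sizing for one lively mode. All inputs are assumed values.

import math

MODAL_MASS = 18_000.0     # kg, modal mass of the lively span mode (assumed)
FREQUENCY = 2.3           # Hz, natural frequency of that mode (assumed)
MASS_RATIO = 0.02         # TMD mass as a fraction of the modal mass


def den_hartog_tmd(modal_mass, f_n, mu):
    """Return TMD mass, spring stiffness, dashpot constant, tuning and damping."""
    m_d = mu * modal_mass
    f_d = f_n / (1.0 + mu)                                   # optimal tuning
    zeta_d = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))   # optimal damping ratio
    omega_d = 2.0 * math.pi * f_d
    k_d = m_d * omega_d ** 2
    c_d = 2.0 * zeta_d * m_d * omega_d
    return m_d, k_d, c_d, f_d, zeta_d


if __name__ == "__main__":
    m_d, k_d, c_d, f_d, zeta_d = den_hartog_tmd(MODAL_MASS, FREQUENCY, MASS_RATIO)
    print(f"TMD mass      : {m_d:8.0f} kg")
    print(f"TMD frequency : {f_d:8.3f} Hz  (damping ratio {zeta_d:.3f})")
    print(f"spring k      : {k_d:8.0f} N/m")
    print(f"dashpot c     : {c_d:8.0f} N*s/m")
```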