312 results for 3G


Relevance:

10.00%

Publisher:

Abstract:

Continental and marine conditions during the last millennium off Porto, Portugal (the southern pole of the North Atlantic Oscillation, NAO), are reconstructed from a sediment archive through a high-resolution multiproxy study and instrumental evidence. Results show multidecadal variability and sea surface temperatures (SSTs) that correlate well with previously published land- and sea-based Northern Hemisphere temperature records, and appear to be responding to long-term solar insolation variability. Precipitation was negatively correlated with the NAO, whereas strong flooding events occurred at times of marked climate cooling (AD 1100-1150 and 1400-1470) and transitions in solar activity. AD 1850 marks a major shift in the phytoplankton community associated with a decoupling of the δ18O records of three planktonic foraminifera species. These changes are interpreted as a response to a reduction in the summer and/or annual upwelling and more frequent fall-winter upwelling-like events. The coincidence of this shift with a decrease in SST and with an increase in coherence between our data and the Atlantic Multidecadal Oscillation (AMO) confirms the connection of the upwelling variability to the North Atlantic Ocean's surface and thermohaline circulation on a decadal scale. The disappearance of this agreement between the AMO and our records beyond AD 1850, and its coincidence with the beginning of the recent rise in atmospheric CO2, supports the hypothesis of a strong anthropogenic effect on the last ~150 yr of the climate record. Furthermore, it raises an important question about the use of instrumental records as the sole calibration data set for climate reconstructions, as these may not provide the best analogue for climate beyond AD 1730.
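The decadal-scale agreement with the AMO described above is the sort of claim usually quantified with a running correlation between the proxy series and the index. A minimal sketch on synthetic data (the series, window length, and the AD 1850 breakpoint check are illustrative, not taken from the study):

```python
# Minimal sketch (not the authors' method): running correlation between a
# reconstructed SST proxy series and an AMO index, both assumed to be
# annually resolved and aligned on the same time axis. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1000, 2001)               # AD 1000-2000, annual steps
amo = np.sin(2 * np.pi * years / 65)        # idealized ~65-yr AMO-like cycle
sst = amo + rng.normal(0, 0.8, years.size)  # proxy = AMO signal + noise

def running_corr(x, y, window=51):
    """Pearson correlation of x and y in a centered sliding window."""
    half = window // 2
    r = np.full(x.size, np.nan)
    for i in range(half, x.size - half):
        r[i] = np.corrcoef(x[i - half:i + half + 1],
                           y[i - half:i + half + 1])[0, 1]
    return r

r = running_corr(sst, amo)
# A drop in r after a given year (e.g. AD 1850) would mirror the loss of
# agreement between the proxy record and the AMO described in the abstract.
print(f"mean r before 1850: {np.nanmean(r[years < 1850]):.2f}")
print(f"mean r after  1850: {np.nanmean(r[years >= 1850]):.2f}")
```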

Relevance:

10.00%

Publisher:

Abstract:

A high-resolution sedimentary sequence recovered from the Tagus prodelta has been studied with the objective of reconstructing multi-decadal to centennial-scale climate variability on the western Iberian Margin and discussing the observations in a wider oceanographic and climatic context. Between ca. 100 BC and AD 400 the foraminiferal fauna and high abundance of Globorotalia inflata indicate advection of subtropical waters via the Azores Current and the winter-time warm Portugal Coastal Current. Between ca. AD 400 and 1350, encompassing the Medieval Climate Anomaly (MCA), enhanced upwelling is indicated by the planktonic foraminiferal fauna, in particular by the high abundance of the upwelling indicator species Globigerina bulloides. Relatively light δ18O values and high sea surface temperature (SST) (reconstructed from foraminiferal assemblages) point to upwelling of subtropical Eastern North Atlantic Central Water. Between ca. AD 1350 and 1750, i.e. most of the Little Ice Age, relatively heavy δ18O values and low reconstructed SST, as well as high abundances of Neogloboquadrina incompta, indicate the advection of cold subpolar waters to the area and a southward deflection of the subpolar front in the North Atlantic, as well as changes in the mode of the North Atlantic Oscillation. In addition, the assemblage composition together with the other proxy data reveals less upwelling and stronger river input than during the MCA. Stronger Azores Current influence on the Iberian Margin and a strong anthropogenic effect on the climate after AD 1750 are indicated by the foraminiferal fauna. The foraminiferal assemblage shows a significant change in surface water conditions at ca. AD 1900, including enhanced river runoff, a rapid increase in temperature and increased influence of the Azores Current. The Tagus record displays a high degree of similarity to other North Atlantic records, indicating that the site is influenced by atmospheric-oceanic processes operating throughout the North Atlantic, as well as by local changes.
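The abstract notes that SST was reconstructed from foraminiferal assemblages. One common way of doing this is the modern analog technique; the sketch below shows the core idea on toy data and is not necessarily the transfer function used in the study:

```python
# Sketch of a modern analog technique (MAT) for assemblage-based SST
# reconstruction -- one common approach, not necessarily the one used here.
# Fossil and modern assemblages are relative abundances over the same taxa.
import numpy as np

def squared_chord(a, b):
    """Standard dissimilarity measure for assemblage counts."""
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

def mat_sst(fossil, modern_assemblages, modern_sst, k=5):
    """Average the SSTs of the k most similar modern samples."""
    d = np.array([squared_chord(fossil, m) for m in modern_assemblages])
    nearest = np.argsort(d)[:k]
    return modern_sst[nearest].mean()

# Toy data: 3 taxa (e.g. G. bulloides, G. inflata, N. incompta), 100 sites
rng = np.random.default_rng(1)
modern = rng.dirichlet(np.ones(3), size=100)   # relative abundances
sst = 10 + 15 * modern[:, 1]                   # SST loosely tied to taxon 2
fossil = np.array([0.5, 0.3, 0.2])
print(f"reconstructed SST: {mat_sst(fossil, modern, sst, k=5):.1f} °C")
```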

Relevance:

10.00%

Publisher:

Abstract:

Sea Surface Temperature (SST), river discharge and biological productivity have been reconstructed from a multi-proxy study of a high-temporal-resolution sedimentary sequence recovered from the Tagus deposition center off Lisbon (Portugal) for the last 2000 years. SST shows 2 °C variability on a century scale that allows the identification of the Medieval Warm Period (MWP) and the Little Ice Age (LIA). High iron (Fe) and fine-sediment deposition accompanied by high n-alkane concentrations and the presence of freshwater diatoms during the LIA (1300-1900 AD) (Science 292 (2001) 662) suggest augmented river discharge, whereas higher total-alkenone concentrations point to increased river-induced productivity. During the MWP (550-1300 AD) (Science 292 (2001) 662), a larger mean grain size, together with low values of magnetic susceptibility and low concentrations of Fe, n-alkanes, and n-alcohols, is interpreted to reflect decreased runoff. At the same time, increased benthic and planktonic foraminifera abundances and the presence of upwelling-related diatoms point to increased oceanic productivity. On the basis of the excellent match found between the negative phases of the North Atlantic Oscillation (NAO) index and the intensified Tagus River discharge observed for the last century, it is hypothesized that the increased influx of terrigenous material during the LIA reflects a negative NAO-like state or the occurrence of frequent extreme NAO minima. During the milder few centuries of the MWP, stronger coastal upwelling conditions are attributed to a persistent, positive NAO-like state or the frequent occurrence of extreme NAO maxima. The peak in magnetic susceptibility, centered at 90 cm composite core depth (ccd), is interpreted as the result of the well-known 1755 AD Lisbon earthquake. The Lisbon earthquake and accompanying tsunami are estimated to have caused the loss of 39 cm of sediment (355 years of record, most of the LIA) and the instantaneous deposition of a 19-cm sediment bed.
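The erosion estimate attributed to the 1755 event can be sanity-checked with simple arithmetic on the abstract's own figures:

```python
# Quick consistency check of the figures quoted in the abstract:
# 39 cm of sediment said to represent 355 years of record implies a
# mean sedimentation rate, which also dates the base of the lost interval.
lost_cm = 39.0
lost_years = 355.0
rate_cm_per_yr = lost_cm / lost_years          # ~0.11 cm/yr
print(f"implied sedimentation rate: {rate_cm_per_yr:.3f} cm/yr "
      f"({10 * rate_cm_per_yr:.1f} mm/yr)")
# Counting back from the AD 1755 earthquake horizon:
print(f"base of lost interval: ~AD {1755 - lost_years:.0f}")
```

Counting 355 years back from AD 1755 lands at roughly AD 1400, consistent with the statement that most of the LIA record (1300-1900 AD) was lost.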

Relevance:

10.00%

Publisher:

Abstract:

Introduction: The chemical composition of water determines its physical properties and the character of the processes that proceed in it: freezing temperature, rate of evaporation, density, color, transparency, filtration capacity, etc. The presence of chemical elements in solution confers on waters special physical properties that exert a significant influence on their circulation, creates the conditions necessary for the development and habitation of flora and fauna, and imparts to ocean waters chemical features that radically distinguish them from land waters (Alekin & Liakhin, 1984). Hydrochemical information helps to determine elements of the water circulation and the depth of convection, makes it easier to distinguish water masses, and gives additional knowledge of the climatic variability of ocean conditions. It is also a necessary part of biological research. The chemical composition of water can be the governing characteristic that determines the possibility and limits of use of marine objects, both stationary and moving, in sea water. The subject of hydrochemistry is the dynamics of chemical composition, i.e. the processes of its formation and the hydrochemical conditions of water bodies (Alekin & Liakhin, 1984).

The hydrochemical processes of the Arctic Ocean are the least known; only scattered publications offer some information on them. A generalizing study of hydrochemical conditions in the Arctic Ocean, based on expeditions conducted in the years 1948-1975, was carried out by Rusanov et al. (1979). The "Atlas of the World Ocean: the Arctic Ocean" contains a special section, "Hydrochemistry" (Gorshkov, 1980), giving typical vertical profiles, transects and maps at depths of 0, 100, 300, 500, 1000, 2000 and 3000 m for the following parameters: dissolved oxygen, phosphate, silicate, pH and the alkalinity-chlorinity coefficient. The maps were constructed using the data of expeditions conducted in the years 1948-1975. The illustrations reflect the main features of the distribution of the hydrochemical elements over a multi-year period and represent a static image of hydrochemical conditions: the distribution at the ocean surface is given for two seasons, winter and summer, while mean annual fields are given for the other depths.

The aim of the present Atlas is to describe the hydrochemical conditions of the Arctic Ocean on the basis of a larger body of hydrochemical information, covering the years 1948-2000, using up-to-date methods of analysis and electronic forms of presentation. The most widespread characteristics determined in water samples were used as hydrochemical indices: dissolved oxygen, phosphate, silicate, pH, total alkalinity, nitrite and nitrate. An important characteristic of the salt composition of water, salinity, was already treated in the Oceanographic Atlas of the Arctic Ocean (1997, 1998). The presentation of hydrochemical characteristics in this Hydrochemical Atlas is broader than in the former Atlas (Gorshkov, 1980): maps of the climatic distribution of the hydrochemical elements were constructed for all standard depths, and the seasonal variability of the hydrochemical parameters is given not only for the surface but also for the underlying standard depths down to and including 400 m. Statistical characteristics of the hydrochemical elements are given for the first time, and detailed accuracy estimates of the initial data and of the map construction are also provided.
Calculated root-mean-square deviations and maximum and minimum values of the parameters demonstrate the limits of their variability over the analyzed period of observations. The Atlas therefore summarizes not only investigations of chemical statics but also demonstrates some elements of chemical dynamics.

Digital arrays of the hydrochemical elements, computed at the nodes of a regular grid, are a new form of presentation in the Atlas. The same grid and the same boxes were used as in the joint US-Russian climatic Oceanographic Atlas, which makes it possible to combine the hydrochemical and oceanographic information of the two Atlases. The first block of digital arrays contains climatic characteristics calculated from direct observational data; these were not calculated for regions without observations, so the information arrays have gaps there. The other block of gridded climatic information was obtained by objective analysis of the observational data. The objective-analysis procedure yields climatic estimates of the hydrochemical characteristics for the whole water area of the Arctic Ocean, including regions not covered by observations. The objective-analysis data can be widely used, in particular in hydrobiological investigations and in modeling the hydrochemical conditions of the Arctic Ocean. The array of initial measurements forms a separate block. It includes all the available materials of hydrochemical observations in the form in which they were presented in the different sources. While keeping in mind that this array contains a certain amount of corrupted information, the authors of the Atlas considered it necessary to preserve it in its primary form: methods of data quality control can be developed in the future as hydrochemical information accumulates, and the assessment of the data rejected under the procedure adopted in the Atlas may also change.

The Hydrochemical Atlas of the Arctic Ocean is the first specialized electronic generalization of hydrochemical observations in the Arctic Ocean, and it completes the program of joint efforts by Russian and US specialists to prepare a series of atlases for the Arctic. The published Oceanographic Atlas (1997, 1998), the Atlas of Arctic Meteorology and Climate (2000), the Ice Atlas of the Arctic Ocean prepared for publication, and this Hydrochemical Atlas represent a unified series of fundamental generalizations of empirical knowledge of the nature of the Arctic Ocean at the climatic level. The Hydrochemical Atlas of the Arctic Ocean was elaborated through the joint efforts of the SRC of the RF AARI and IARC. Dr. Ye. Nikiforov was the scientific supervisor of the Atlas; Dr. R. Colony was the manager on behalf of the USA and Dr. L. Timokhov on behalf of Russia.
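The abstract does not spell out the objective-analysis scheme, but the essence of such procedures, spreading scattered station data onto a regular grid so that even unsampled regions receive smoothed estimates, can be illustrated with a simple Gaussian-weighted (Barnes-type) analysis. All numbers below are synthetic:

```python
# Illustrative sketch of one simple form of objective analysis
# (Gaussian-weighted interpolation); the Atlas's actual scheme is not
# specified here, so treat this purely as a conceptual example.
import numpy as np

def gaussian_analysis(obs_xy, obs_val, grid_x, grid_y, length_scale=200.0):
    """Estimate a field on a regular grid from scattered observations.

    Each grid node gets a distance-weighted mean of all observations, so
    nodes far from any data still receive a (heavily smoothed) value --
    which is how gap regions can be filled, as described in the abstract.
    """
    gx, gy = np.meshgrid(grid_x, grid_y)
    field = np.zeros_like(gx, dtype=float)
    weights = np.zeros_like(gx, dtype=float)
    for (x, y), v in zip(obs_xy, obs_val):
        d2 = (gx - x) ** 2 + (gy - y) ** 2
        w = np.exp(-d2 / (2 * length_scale ** 2))
        field += w * v
        weights += w
    return field / weights

# Toy example: 20 silicate "stations" mapped onto a regular grid
rng = np.random.default_rng(2)
stations = rng.uniform(0, 1000, size=(20, 2))      # km coordinates
silicate = rng.uniform(2, 12, size=20)             # µmol/l, synthetic
grid = gaussian_analysis(stations, silicate,
                         np.linspace(0, 1000, 21), np.linspace(0, 1000, 21))
print(grid.shape, round(grid.min(), 2), round(grid.max(), 2))
```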

Relevance:

10.00%

Publisher:

Abstract:

Television environments currently handle a large number of signals coming from different sources and supplied in different standards: SD-SDI, HD-SDI, 3G, DVB-ASI, and L-band modulated signals. In the great majority of broadcast centers it is necessary, at least at some step of signal transport or exploitation, to replicate the different input signals to their various destinations. This PFC (final-degree project) presents the main systems used in television environments to distribute these signals. Distributive systems distribute and replicate their input signal to different destinations. The main idea behind the design of these systems is that the input signal must not suffer any degradation as a result of replication or division; it is even advisable for certain systems to regenerate the input signal at their outputs. The project is divided into three chapters: a chapter on theoretical concepts, presenting the theoretical foundations behind the technology used by the different systems; a chapter on the distribution of SDI and ASI signals, presenting the main systems used to distribute SD-SDI, HD-SDI, 3G and DVB-ASI signals; and a chapter on the distribution of L-band signals, covering the main systems for distributing L-band modulated signals. After reading it, the reader should know the main systems used in the distribution of SDI, ASI and L-band signals, and should understand, through the theory presented and the systems described, the problems the different pieces of equipment face and the underlying technology that solves them.
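The design principle stated above, that replication must not degrade the signal and that regeneration is desirable, can be illustrated numerically: a regenerating distribution stage re-decides each bit before fan-out, so noise does not accumulate across cascaded stages. A deliberately simplified sketch (no real SDI coding or reclocking):

```python
# Conceptual sketch (illustrative only) of why regenerating distribution
# amplifiers are preferred over passive splitters: re-deciding each bit
# before fan-out stops noise from accumulating across distribution stages.
import numpy as np

rng = np.random.default_rng(3)
bits = rng.integers(0, 2, 1000)            # original SDI-like bit stream
signal = bits.astype(float)                # idealized levels 0.0 / 1.0

def passive_split(sig, noise=0.15):
    """Passive division: each output keeps the input's accumulated noise."""
    return sig + rng.normal(0, noise, sig.size)

def regenerating_split(sig, noise=0.15):
    """Regeneration: slice back to clean logic levels before fan-out."""
    noisy = sig + rng.normal(0, noise, sig.size)
    return (noisy > 0.5).astype(float)     # decision stage (reclocking omitted)

# Cascade 4 distribution stages each way and compare bit errors
passive, regen = signal, signal
for _ in range(4):
    passive = passive_split(passive)
    regen = regenerating_split(regen)
print("errors, passive chain:     ", int(np.sum((passive > 0.5) != bits)))
print("errors, regenerating chain:", int(np.sum(regen != bits)))
```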

Relevance:

10.00%

Publisher:

Abstract:

Increasing availability (and affordability) of mobile broadband:
- In 2015 half of the subscriber base will be in 3G/4G, and 80% in 2020 (27% in 2011)
- 7.6 billion mobile users by 2020 (5.4 billion in 2011)
- Mobile subscribers per 100 inhabitants: 99
Increasing availability (and affordability) of smartphones:
- In 2020, 81% of phones sold globally will be smartphones (2.5 billion), up from 26% in 2011 (400 million)
- 595 million tablets in 2020 (70 million in 2011)
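A little arithmetic makes the growth rates implied by these projections explicit; a minimal sketch using only the numbers quoted above:

```python
# Implied compound annual growth rates behind the figures quoted above
# (simple arithmetic on the abstract's own numbers, nothing more).
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

print(f"mobile users 2011-2020: {cagr(5.4, 7.6, 9):.1%} per year")
print(f"smartphones  2011-2020: {cagr(0.4, 2.5, 9):.1%} per year")
print(f"tablets      2011-2020: {cagr(0.07, 0.595, 9):.1%} per year")
```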

Relevance:

10.00%

Publisher:

Abstract:

This project covers the study of different mobile-communications simulators that analyze the behavior of the UMTS (Universal Mobile Telecommunications System, 3G) and LTE (Long Term Evolution, 3.9G) technologies, focusing mainly on the LTE simulators, since that is the technology currently being deployed. Before analyzing the characteristics of the most important radio interface of this generation, 3.9G, a general overview is given of how mobile communications have evolved throughout history, and the characteristics of the current mobile technology, 3.9G, are analyzed; the project then focuses on a pair of simulators that demonstrate these characteristics through graphical results. The use of such simulators is essential today: mobile communications advance at a dizzying pace, and it is therefore necessary to know the performance the different mobile technologies in use can deliver. The simulators used in this project make it possible to analyze the behavior of several scenarios, since different types of simulators exist at both the link level and the system level. A number of simulators for third-generation UMTS are mentioned, but the simulators studied and analyzed in depth in this final-degree project are the "Link-Level" and "System-Level" simulators developed by the Institute of Communications and Radio-Frequency Engineering at the University of Vienna. These simulators support different kinds of studies: analyzing the behavior of the link between a base station and a single user, in the case of the link-level simulator, or analyzing the performance of an entire network, in the case of the system-level simulator. Based on the results obtained from both simulators, a series of questions, both theoretical and practical, is posed, following the lab exercise prepared by Pedro García del Pino, professor at the Universidad Politécnica de Madrid (UPM), to verify that the simulators analyzed have been understood. Finally, the conclusions drawn from the project and the future lines of work are presented.
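At its core, a link-level simulator sweeps channel quality for a single base-station-to-user link and measures error rates. The sketch below illustrates that loop with uncoded BPSK over an AWGN channel; it is a stand-in for the idea, not the Vienna simulator itself:

```python
# A link-level simulator, at its core, sweeps channel quality and measures
# error rates for one base-station-to-user link. This is a deliberately tiny
# stand-in for that idea (uncoded BPSK over AWGN), not the Vienna simulator.
import numpy as np

rng = np.random.default_rng(4)

def ber_at_snr(snr_db, n_bits=200_000):
    bits = rng.integers(0, 2, n_bits)
    symbols = 2.0 * bits - 1.0                     # BPSK mapping: 0->-1, 1->+1
    noise_std = np.sqrt(1 / (2 * 10 ** (snr_db / 10)))
    received = symbols + rng.normal(0, noise_std, n_bits)
    decided = (received > 0).astype(int)           # hard decision
    return np.mean(decided != bits)

for snr_db in range(0, 11, 2):                     # the "link-level sweep"
    print(f"Eb/N0 = {snr_db:2d} dB -> BER = {ber_at_snr(snr_db):.2e}")
```

A system-level simulator wraps this kind of per-link result into a network-wide model with many users, cells and schedulers, which is why the two simulator types are typically used together.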

Relevance:

10.00%

Publisher:

Abstract:

This document describes the installation of the "Provisioning Gateway" (PG) subscriber-provisioning system for Ericsson's Machine to Machine (M2M) project, a technology that makes it possible to define the subscribers, or devices, to be used in machine-to-machine communications. The project was carried out at Ericsson for a Spanish telephone company. The document reviews how communications networks have evolved in recent years and the applications that have arisen from them, with special emphasis on the most widely used network technology today, 3G, and on its successor, 4G, detailing the novelties the latter incorporates. It then describes the work performed by the Ericsson engineer during the installation of the PG, a network node through which the telephone company registers subscribers, modifies their parameters, or removes them from the database. The different phases of the installation are explained, covering both the hardware and the software parts, and finally the acceptance tests with the customer. The document closes with some conclusions about the work done and the knowledge acquired.
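The abstract describes the PG as the node through which subscribers are created, modified and deleted. Its actual interface is not given, so the following is a purely hypothetical sketch of those three operations; every name in it is invented:

```python
# The abstract does not specify the PG's actual interface, so this is a
# purely hypothetical sketch of the three operations the node performs
# (create, modify, delete subscribers); all names here are invented.
class ProvisioningGateway:
    """Toy stand-in for the PG's subscriber-database front-end."""

    def __init__(self):
        self._db = {}                          # imsi -> subscriber parameters

    def create(self, imsi, **params):
        if imsi in self._db:
            raise ValueError(f"{imsi} already provisioned")
        self._db[imsi] = dict(params)

    def modify(self, imsi, **params):
        self._db[imsi].update(params)          # KeyError if unknown subscriber

    def delete(self, imsi):
        del self._db[imsi]

pg = ProvisioningGateway()
pg.create("214070000000001", profile="m2m-basic", apn="m2m.example")
pg.modify("214070000000001", profile="m2m-premium")
pg.delete("214070000000001")
```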

Relevance:

10.00%

Publisher:

Abstract:

The technological world is changing toward the optimization of resource management, driven by the powerful influence of technologies such as virtualization and cloud computing. This report takes a closer look at both, from the causes that motivated them to their latest trends, identifying their main characteristics, advantages and disadvantages along the way. The Digital Home, meanwhile, is already a reality for most people. It provides access to multiple types of telecommunications networks (3G, 4G, Wi-Fi, ADSL...) of varying capacity that allow Internet connections from anywhere, at any time, and with practically any device (personal computers, smartphones, tablets, televisions...). Companies take advantage of this to offer all kinds of services. Some of these services are based on cloud computing, above all offering cloud storage to devices with limited capacity, such as smartphones and tablets. That storage space normally resides on servers under the control of large companies. Storing private documents, videos and photos without any certainty that they are not being viewed by someone without consent can arouse a certain wariness in users. For those users who want control over their privacy, there is the option of mounting their own servers and their own cloud service, sharing private information only with family and friends or with anyone they grant permission to. During the project, several solutions were compared, most of them open source and freely distributed, that allow the deployment of at least a storage service accessible through the Internet. Some of them complement this with streaming of music and videos, sharing and synchronization of documents across multiple devices, calendars, backups, desktop virtualization, file versioning, chats, and so on. The project ends with a demonstration of how devices in a digital home interact with a cloud server on which one of the compared solutions has previously been installed and configured. This server is packaged in a virtual machine so that it is easily transportable and usable.
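The common denominator of the compared solutions is a file store reachable over the network. A deliberately minimal sketch of that core service (a real self-hosted suite adds authentication, synchronization, streaming, versioning, and so on):

```python
# Minimal sketch of the core service every compared solution provides --
# file storage reachable over the network. Real self-hosted suites add
# auth, sync clients, streaming, versioning, etc. on top of this idea.
import http.server
import pathlib

STORE = pathlib.Path("cloud_store")
STORE.mkdir(exist_ok=True)

class StorageHandler(http.server.SimpleHTTPRequestHandler):
    """GET serves files from cloud_store/; PUT uploads into it."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, directory=str(STORE), **kwargs)

    def do_PUT(self):
        name = pathlib.Path(self.path).name        # flatten: no subdirectories
        length = int(self.headers["Content-Length"])
        (STORE / name).write_bytes(self.rfile.read(length))
        self.send_response(201)
        self.end_headers()

if __name__ == "__main__":
    # upload:   curl -T photo.jpg http://localhost:8080/photo.jpg
    # download: curl http://localhost:8080/photo.jpg -o photo.jpg
    http.server.HTTPServer(("", 8080), StorageHandler).serve_forever()
```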