943 results for Vienna. Bürgerliches Zeughaus.
Abstract:
This project presents a study of different mobile-communications simulators that analyze the behavior of the UMTS (Universal Mobile Telecommunications System, 3G) and LTE (Long Term Evolution, 3.9G) technologies, focusing mainly on the LTE simulators, since this is the technology being deployed today. Therefore, before analyzing the characteristics of the most important radio interface of this generation, 3.9G, a general overview is given of how mobile communications have evolved throughout history; the characteristics of the current mobile technology, 3.9G, are then analyzed, before focusing on a pair of simulators that demonstrate these characteristics through graphical results. Today, the use of such simulators is absolutely necessary: mobile communications advance at a dizzying pace, so it is essential to know the performance that the different mobile technologies in use can deliver. The simulators used in this project allow the behavior of several scenarios to be analyzed, since different types of simulators exist, both at link level and at system level. A number of simulators for the third generation, UMTS, are mentioned, but the simulators studied and analyzed in depth in this final-year project are the "Link-Level" and "System-Level" simulators developed by the Institute of Communications and Radio-Frequency Engineering at the University of Vienna. These simulators support different kinds of simulations: analyzing the behavior between a base station and a single user, in the case of the link-level simulator, or analyzing the behavior of an entire network, in the case of the system-level simulator.
With the results obtained from both simulators, a series of questions is posed, both theoretical and practical, based on the laboratory exercise prepared by Pedro García del Pino, professor at the Universidad Politécnica de Madrid (UPM), to check that the simulators analyzed have been understood. Finally, the conclusions drawn from this project are presented, together with future lines of work.
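As a toy illustration of what a link-level simulator evaluates — the link between one base station and a single user — the following Python sketch estimates the bit-error rate of BPSK over an AWGN channel by Monte Carlo. This is a drastic simplification for illustration only; the Vienna simulators model the full LTE physical layer.

```python
import math
import random

# Toy link-level Monte Carlo: BPSK over AWGN between one base station
# and one user. Eb/N0 (linear) = snr, so the per-sample noise standard
# deviation is sqrt(1 / (2 * snr)) for unit-energy symbols.

def ber_bpsk(snr_db: float, nbits: int = 100_000, seed: int = 1) -> float:
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)
    sigma = math.sqrt(1 / (2 * snr))
    errors = 0
    for _ in range(nbits):
        bit = rng.randrange(2)
        x = 1.0 if bit else -1.0          # BPSK mapping
        y = x + rng.gauss(0.0, sigma)     # AWGN channel
        errors += (y > 0) != bool(bit)    # hard decision
    return errors / nbits
```

A system-level simulator would instead iterate this kind of link abstraction over many users, cells and schedulers.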
Abstract:
In recent years, remote sensing imaging systems for the measurement of oceanic sea states have attracted renewed attention. Imaging technology is economical and non-invasive, and it enables a better understanding of the space-time dynamics of ocean waves over an area, rather than at the selected point locations of earlier monitoring methods (buoys, wave gauges, etc.). We present recent progress in space-time measurement of ocean waves using stereo vision systems on offshore platforms, focusing on sea states with wavelengths in the range of 0.01 m to 1 m. Both traditional disparity-based systems and modern elevation-based ones are presented in a variational optimization framework: the main idea is to pose the stereoscopic reconstruction problem of the surface of the ocean in a variational setting and to design an energy functional whose minimizer is the desired temporal sequence of wave heights. The functional combines photometric observations as well as spatial and temporal smoothness priors. Disparity methods estimate the disparity between images as an intermediate step toward retrieving the depth of the waves with respect to the cameras, whereas elevation methods estimate the ocean surface displacements directly in 3-D space. Both techniques are used to measure ocean waves from real data collected at offshore platforms in the Black Sea (Crimean Peninsula, Ukraine) and the Northern Adriatic Sea (Venice coast, Italy). Then, the statistical and spectral properties of the resulting observed waves are analyzed. We show the advantages and disadvantages of the presented stereo vision systems and discuss future lines of research to improve their performance in critical issues such as the robustness of the camera calibration in spite of undesired variations of the camera parameters, or the processing time it takes to retrieve ocean wave measurements from the stereo videos, which are very large datasets that need to be processed efficiently to be of practical use.
Multiresolution and short-time approaches would improve the efficiency and scalability of the techniques so that wave displacements can be obtained in feasible time.
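The elevation-based variational formulation described above can be sketched as follows. This is an illustrative form under assumed notation — $Z(\mathbf{x},t)$ for the elevation field over the domain $\Omega$, $\pi_i$ for the projection into camera $i$, and $\alpha,\beta$ for the smoothness weights — not the exact functional of the cited systems:

```latex
E(Z) \;=\; \underbrace{\int_0^T\!\!\int_\Omega \sum_{i<j}
  \bigl( I_i(\pi_i(\mathbf{x},Z)) - I_j(\pi_j(\mathbf{x},Z)) \bigr)^2
  \, d\mathbf{x}\, dt}_{\text{photometric consistency}}
\;+\; \alpha \int_0^T\!\!\int_\Omega \lVert \nabla Z \rVert^2 \, d\mathbf{x}\, dt
\;+\; \beta \int_0^T\!\!\int_\Omega \Bigl(\frac{\partial Z}{\partial t}\Bigr)^2 \, d\mathbf{x}\, dt
```

The minimizer of $E$ is the temporal sequence of wave-height fields; disparity-based methods use an analogous functional with the disparity map in place of $Z$.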
Abstract:
A methodology has been developed for characterising the mechanical behaviour of concrete, based on the damaged plasticity model, enriched with a user subroutine (V)USDFLD in order to better capture the ductility of the material under moderate confining pressures. The model has been applied in the context of the international benchmark IRIS_2012, organised by the OECD/NEA/CSNI Nuclear Energy Agency, dealing with impacts of rigid and deformable missiles against reinforced concrete targets. A slightly modified version of the concrete damaged plasticity model was used to represent the concrete. The simulation results matched the observations made during the actual tests very well. Particularly successful predictions involved the energy spent by the rigid missile in perforating the target, the crushed length of the deformable missile, the crushed and cracked areas of the concrete target, and the values of the strains recorded at a number of locations in the concrete slab.
Abstract:
Observations throughout nearly all of the last century have shown that the Spanish (dynamic) maritime climate followed cycles of around 10 to 11 years in its most significant variable, wind waves, although it is better to register cycles of 20 to 22 years, by analogy with the semi-diurnal and diurnal cycles of the Cantabrian tides. Those cycles were soon linked to solar activity and, at the end of the century, the latter was related to the evolution of the Solar System. We now know that waves and storm surges are coupled, and that the (dynamic) maritime climate forms part of a more complex “thermal machine” that includes the hydrological cycle. The analysis of coastal floods could thus facilitate the extension of that experience. According to their immediate cause, simple floods are usually classified into flash, pluvial, fluvial, groundwater and coastal types, the last being those caused by sea waters. But the fact is that most coastal floods are the result of the concomitance of several of the former simple types. Indeed, several Southeastern Mediterranean coastal flood events turn out to be the result of the superposition, within the coastal zone, of the flash, fluvial, pluvial and groundwater flood types, under the boundary condition imposed by the concomitant storm sea-level rise. This work should be regarded as an attempt to clarify that cyclic experience through an in-depth review of past flood events in Valencia (Turia and Júcar basins) as well as in Murcia (Segura basin).
Abstract:
The Santa Irene flood event, at the end of October 1982, is one of the most dramatic and widely reported flood events in Spain. Its renown is mainly due to the collapse of the Tous dam, but its main lesson is that it stands as a paradigm of how maritime/littoral weather — and the temporary sea-level rise by storm surge accompanying the rain process — affects inland floods on the coastal plains. Looking at the damages, the presentation analyzes the adopted measures from the point of view of the aims of the FP7 SMARTeST project, which is related to flood-resilience improvement in urban areas through Technologies, Systems and Tools and an appropriate "road to the market".
Abstract:
Plant allergens have hitherto been assigned to only a few protein families, which share no common biochemical features. Their physical, biochemical and immunological characteristics have been widely studied, but no definite conclusion has been reached about what makes a protein an allergen. N-glycosylation is characteristic of plant allergen sources but is not present in mammals.
Abstract:
Remote reprogramming capabilities are one of the major concerns in WSN platforms, due to the limitations and constraints that low-power wireless nodes pose, especially when energy efficiency during the reprogramming process is a critical factor for extending the battery life of the devices. Moreover, WSNs are based on low-rate protocols, in which the greater the amount of data sent, the higher the probability of losing packets during transmission. In order to overcome these limitations, in this work a novel on-the-fly reprogramming technique for modifying and updating the application running on the wireless sensor nodes is designed and implemented. It is based on a partial reprogramming mechanism that significantly reduces the size of the files to be downloaded to the nodes, thereby diminishing their power and time consumption. This powerful mechanism also supports multi-experimental capabilities, because it provides the possibility to download, manage, test and debug multiple applications on the wireless nodes, based on a memory-map segmentation of the core. Being an on-the-fly reprogramming process, no additional resources to store and download the configuration file are needed.
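The partial-reprogramming idea — transmit only what changed between firmware images — can be illustrated with a block-level diff sketch. This is a hypothetical simplification in Python (block size and function names are assumptions); the actual mechanism operates on the node's memory-map segments:

```python
# Sketch of a block-level partial update. Only blocks that differ
# between the old and new images are sent over the low-rate link.
BLOCK = 64  # bytes per block, assumed

def diff_blocks(old: bytes, new: bytes, block: int = BLOCK):
    """Return (block_index, data) pairs for the blocks that changed."""
    patches = []
    nblocks = (len(new) + block - 1) // block
    for i in range(nblocks):
        chunk = new[i * block:(i + 1) * block]
        if old[i * block:(i + 1) * block] != chunk:
            patches.append((i, chunk))
    return patches

def apply_patches(old: bytes, patches, new_len: int, block: int = BLOCK):
    """Node-side reconstruction of the new image from the old one."""
    img = bytearray(old[:new_len].ljust(new_len, b"\xff"))  # 0xFF = erased flash
    for i, chunk in patches:
        img[i * block:i * block + len(chunk)] = chunk
    return bytes(img)
```

A one-byte change inside a single block then costs one block of airtime instead of the whole image, which is the source of the power and time savings claimed above.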
Abstract:
Within our study of the plausibility of a subglacial lake under the Amundsenisen Icefield in Southern Spitzbergen, Svalbard archipelago (Glowacki et al., 2007), here we focus on the sensitivity of the system to the thermal effect of the firn and snow layers. A rough heat-balance analysis shows that the firn layer plays an important role by driving the heat release to the atmosphere, so that its influence on the ice-water phase transition cannot be neglected (Bucchignani et al., 2012).
Abstract:
Type 1 diabetes mellitus implies a life-threatening absolute insulin deficiency. The artificial pancreas (CGM sensor, insulin pump and control algorithm) promises to outperform current open-loop therapies.
Abstract:
Gestational Diabetes (GD) has increased over the last 20 years, affecting up to 15% of pregnant women worldwide. The associated complications can be reduced with appropriate glycemic control during the pregnancy.
Abstract:
Crossed-arch vaults are a particular type of ribbed vault. Their main feature is that the ribs that form the vault are intertwined, forming polygons or stars and leaving an empty space in the middle. The first examples appear in Córdoba in the second half of the 10th Century. Afterwards, the type diffused through Spain and North Africa during the 11th–13th Centuries. These vaults reappear in Armenia in the 13th Century. In the 14th and 15th Centuries a few examples are found both in England (Durham, Raby) and in Central Europe (Prague, Landshut, Vienna). At about the same time, Leonardo da Vinci produced designs for the tiburio (ciborium) of Milan Cathedral with a crossed-arch structure and proposed tests to assess its strength; he also made use of the same pattern of vault for Renaissance centralized churches. Eventually, the type can be traced through the 17th (Guarini) and 18th (Vittone) Centuries, down to Spanish post-war architecture of the 1940s–60s (Moya). Some questions arise which so far have not been answered. How was it possible for a particular type of vault to have such an enormous geographical spread? How was it transmitted from Córdoba to the Caucasus? The matter is one of transfer of knowledge, ideas and technology; it concerns both aesthetics and construction.
Abstract:
Singular-value decomposition (SVD)-based multiple-input multiple-output (MIMO) systems, where the whole MIMO channel is decomposed into a number of unequally weighted single-input single-output (SISO) channels, have attracted a lot of attention in the wireless community. The unequal weighting of the SISO channels has led to intensive research on bit and power allocation, even in MIMO channel situations with poor scattering conditions, identified as the antenna correlation effect. In this situation, the unequal weighting of the SISO channels becomes even stronger. In comparison with SVD-assisted MIMO transmission, geometric mean decomposition (GMD)-based MIMO systems are able to compensate for the drawback of the unequally weighted SISO channels obtained with the SVD, since the decomposition result is nearly independent of the antenna correlation effect. The remaining interference after the GMD-based signal processing can easily be removed by using dirty-paper precoding, as demonstrated in this work. Our results show that GMD-based MIMO transmission has the potential to significantly simplify the bit- and power-loading processes, and that it outperforms SVD-based MIMO transmission as long as the same QAM constellation size is used on all equally weighted SISO channels.
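The contrast between the two decompositions can be sketched numerically. The snippet below (a hedged illustration assuming a Kronecker antenna-correlation model) computes the unequal SISO gains that the SVD produces for a correlated 4x4 channel, and the common gain that a GMD would equalize them to — the geometric mean of the singular values. The GMD factorization itself (built via Givens rotations) is not implemented here:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # 4x4 MIMO channel, assumed

# Hypothetical Kronecker correlation model: H = R^{1/2} G R^{1/2},
# with exponential antenna correlation rho^|i-j|.
R = np.array([[0.9 ** abs(i - j) for j in range(n)] for i in range(n)])
Rs = np.linalg.cholesky(R)
G = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
H = Rs @ G @ Rs.conj().T

# SVD: the singular values are the (unequal) SISO channel gains.
s = np.linalg.svd(H, compute_uv=False)

# GMD would yield equal diagonal gains: the geometric mean of s.
gmd_gain = np.prod(s) ** (1.0 / n)

print("SVD gains:", np.round(s, 3))
print("GMD common gain:", round(gmd_gain, 3))
```

With strong correlation the spread between the largest and smallest SVD gain grows, while the GMD gain stays a single value for all layers, which is what removes the need for per-layer bit loading.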
Abstract:
The development of mixed-criticality virtualized multi-core systems poses new challenges that are the subject of active research. There is an additional complexity: it is now required to identify a set of partitions and to allocate applications to partitions. In this task, a number of issues have to be considered, such as the criticality level of the application, security and dependability requirements, the granularity of the timing requirements, etc. The MultiPARTES [11] toolset relies on Model-Driven Engineering (MDE), which is a suitable approach in this setting, as it helps to bridge the gap between design issues and partitioning concerns. MDE is changing the way systems are developed nowadays, reducing development time. In general, modelling approaches have shown their benefits when applied to embedded systems. These benefits have been achieved by fostering reuse with an intensive use of abstractions, or by automating the generation of boiler-plate code.
Abstract:
This paper presents a novel vehicle-to-vehicle energy exchange market (V2VEE) between electric vehicles (EVs), aimed at decreasing the energy cost paid by users whose EVs must be recharged during the day to fulfil their daily scheduled trips, and at reducing the impact of charging on the electric grid. EVs with excess energy in their batteries can transfer this energy to other EVs that need to charge during their daily trips. This second type of owner can buy energy directly from the electric grid, or buy it from another EV at a lower price. An aggregator is responsible for collecting the information from all vehicles located in the same area at the same time and for making this energy transfer possible.
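The aggregator's matching step can be sketched as a simple price-ordered clearing: buyers take the cheapest below-grid offers first and fall back to the grid for any remainder. This is a hypothetical illustration (names, prices and the greedy rule are assumptions, not the paper's actual market mechanism):

```python
# Minimal sketch of the aggregator's matching step (names hypothetical).
GRID_PRICE = 0.30  # EUR/kWh, assumed grid tariff

def match(buyers, sellers, grid_price=GRID_PRICE):
    """buyers: {id: kWh needed}; sellers: list of (id, kWh surplus, price).
    Returns per-buyer cost and the list of (seller, buyer, kWh) trades."""
    # Only offers cheaper than the grid are attractive; serve cheapest first.
    offers = sorted((s for s in sellers if s[2] < grid_price), key=lambda s: s[2])
    offers = [[sid, kwh, p] for sid, kwh, p in offers]  # mutable copies
    costs, trades = {}, []
    for bid, need in buyers.items():
        cost = 0.0
        for offer in offers:
            if need <= 0:
                break
            take = min(need, offer[1])
            if take > 0:
                cost += take * offer[2]
                offer[1] -= take
                need -= take
                trades.append((offer[0], bid, take))
        cost += need * grid_price  # remainder bought from the grid
        costs[bid] = round(cost, 4)
    return costs, trades
```

Any offer priced at or above the grid tariff is ignored, since the buyer would do no better than charging from the grid directly.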