964 results for Data Warehousing Systems
Abstract:
A collection of slides from the author's seminar presentation is given.
Abstract:
Ethernet is becoming the dominant aggregation technology for carrier transport networks; however, as it is a LAN technology, native bridged Ethernet does not fulfill all the carrier requirements. One of the schemes proposed by the research community to make Ethernet fulfill carrier requirements is Ethernet VLAN-label switching (ELS). ELS allows the creation of label-switched data paths using a 12-bit label encoded in the VLAN tag control information field. Previous label-switching technologies such as MPLS use more bits to encode the label and therefore do not suffer from the label-sparsity issues that ELS might. This paper studies the sparsity issues resulting from the reduced ELS VLAN-label space and proposes the use of the label-merging technique to improve label-space usage. Experimental results show that label merging considerably improves label-space usage.
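As a rough illustration of the scale of the problem, the sketch below shows how a 12-bit VLAN-label space (4096 values) can be exhausted by per-path label assignment, and how a merging policy that reuses one outgoing label per egress node relieves the pressure. This is a minimal, hedged sketch; the class and function names (MergingSwitch, assign_label) are illustrative assumptions, not the paper's implementation.

    VLAN_LABEL_BITS = 12
    LABEL_SPACE = 2 ** VLAN_LABEL_BITS      # at most 4096 labels per link

    def extract_vlan_label(tci: int) -> int:
        """Extract the 12-bit VLAN ID (used here as the ELS label) from the
        16-bit Tag Control Information field (PCP:3 | DEI:1 | VID:12)."""
        return tci & 0x0FFF

    class MergingSwitch:
        """Assigns outgoing labels per egress node instead of per incoming
        path, so all paths towards the same egress share one label."""
        def __init__(self):
            self.next_free = 0
            self.by_egress = {}             # egress node -> outgoing label

        def assign_label(self, egress: str) -> int:
            if egress not in self.by_egress:
                if self.next_free >= LABEL_SPACE:
                    raise RuntimeError("VLAN-label space exhausted")
                self.by_egress[egress] = self.next_free
                self.next_free += 1
            return self.by_egress[egress]

    # Without merging, 10 000 label-switched paths would need 10 000 labels
    # (> 4096); with merging they need at most one label per egress node.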
Abstract:
Quantitatively assessing the importance or criticality of each link in a network is of practical value to operators, as it can help them increase the network's resilience, provide more efficient services, or improve other aspects of the service. Betweenness is a graph-theoretical measure of centrality that can be applied to communication networks to evaluate link importance. However, as we illustrate in this paper, the basic definition of betweenness centrality produces inaccurate estimations, as it does not take into account aspects relevant to networking such as the heterogeneity in link capacity or the differences between node pairs in their contribution to the total traffic. A new algorithm for discovering link centrality in transport networks is proposed in this paper. It requires only static or semi-static network and topology attributes, yet produces estimations of good accuracy, as verified through extensive simulations. Its potential value is demonstrated by an example application in which the simple shortest-path routing algorithm is improved so that it outperforms other, more advanced algorithms in terms of blocking ratio.
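The sketch below illustrates one way such networking aspects can be folded into a betweenness-style score: each node pair's shortest paths are weighted by that pair's offered traffic instead of counting every pair equally. This is a hedged sketch, not the algorithm proposed in the paper; the traffic matrix and the function name are assumptions.

    import itertools
    import networkx as nx

    def traffic_weighted_edge_betweenness(G, traffic):
        """traffic[(s, t)] = offered load between s and t (assumed known)."""
        score = {tuple(sorted(e)): 0.0 for e in G.edges()}
        for s, t in itertools.combinations(G.nodes(), 2):
            load = traffic.get((s, t), 0.0) + traffic.get((t, s), 0.0)
            if load == 0.0:
                continue
            paths = list(nx.all_shortest_paths(G, s, t))
            for path in paths:                       # split load over equal-cost paths
                for u, v in zip(path, path[1:]):
                    score[tuple(sorted((u, v)))] += load / len(paths)
        return score

    G = nx.cycle_graph(5)
    traffic = {(0, 2): 10.0, (1, 3): 1.0}            # heterogeneous demands
    print(traffic_weighted_edge_betweenness(G, traffic))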
Abstract:
This paper presents a study of connection availability in GMPLS over optical transport networks (OTN), taking into account different network topologies. Two basic path-protection schemes are considered and compared with the no-protection case. The selected topologies are heterogeneous in geographic coverage, network diameter, link lengths, and average node degree. Connection availability is also computed considering the reliability data of physical components and a well-known network availability model. Results show several correspondences between suitable path-protection algorithms and several network topology characteristics.
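For orientation, the sketch below applies the standard series/parallel availability algebra that such studies typically build on: an unprotected connection is up only if every component along its path is up, while a path-protected connection fails only when both disjoint paths fail at the same time. The component availabilities used here are illustrative assumptions, not the reliability data of the paper.

    def series_availability(avails):
        """Unprotected path: every component must be up (series system)."""
        a = 1.0
        for x in avails:
            a *= x
        return a

    def protected_availability(a_working, a_backup):
        """1+1 path protection over disjoint paths: the connection fails only
        when both paths fail simultaneously (parallel system)."""
        return 1.0 - (1.0 - a_working) * (1.0 - a_backup)

    working = series_availability([0.9995] * 6)   # e.g. 6 links/nodes in series
    backup  = series_availability([0.9995] * 8)   # longer disjoint backup path
    print(working, protected_availability(working, backup))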
Abstract:
As supervisory systems evolve, obtaining significant information from processes becomes more important, since it simplifies the particular tasks of the supervision system. Signal-treatment tools capable of extracting elaborate information from process data are therefore valuable. This paper presents a tool that obtains qualitative data about the trends and oscillation of signals, together with an application of the tool. In this application, the tool, implemented in a computer-aided control systems design (CACSD) environment, is used to provide information to an expert system for fault detection in a laboratory plant.
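A minimal sketch of the kind of qualitative abstraction involved is given below: each window of a signal is labelled increasing, decreasing, or steady from the sign of its fitted slope. This is an assumption-laden illustration, not the tool described in the paper; the window length and threshold are arbitrary.

    import numpy as np

    def qualitative_trends(signal, window=20, eps=1e-3):
        """Label each non-overlapping window by the sign of its mean slope."""
        labels = []
        for start in range(0, len(signal) - window, window):
            chunk = signal[start:start + window]
            slope = np.polyfit(np.arange(window), chunk, 1)[0]
            if slope > eps:
                labels.append("increasing")
            elif slope < -eps:
                labels.append("decreasing")
            else:
                labels.append("steady")
        return labels

    t = np.linspace(0, 10, 200)
    print(qualitative_trends(np.sin(t)))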
Abstract:
The design of control, estimation, or diagnosis algorithms most often assumes that all available process variables represent the system state at the same instant of time. However, this is never true in networked systems, because of the unknown deterministic or stochastic transmission delays introduced by the communication network. During the diagnosis stage, this will often generate false alarms. Under nominal operation, the different transmission delays associated with the variables used in the residual computation produce discrepancies of the residuals from zero. A technique aimed at minimising the resulting false-alarm rate, based on the explicit modelling of the communication delays and on their best-case estimation, is proposed.
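The sketch below reproduces the basic effect in a toy setting: a measurement that arrives with an unknown network delay makes a nominally zero residual deviate from zero, and a threshold derived from a worst-case delay bound absorbs that deviation without raising alarms. The plant signal, delay bound, and threshold rule are illustrative assumptions, not the technique of the paper.

    import numpy as np

    dt = 0.01
    t = np.arange(0.0, 10.0, dt)
    y_true = np.sin(t)                            # process output (model = truth here)
    delay = 7                                     # unknown network delay, in samples

    # The sample received over the network is old, but is treated as current
    y_received = np.empty_like(y_true)
    y_received[:delay] = y_true[0]
    y_received[delay:] = y_true[:-delay]

    residual = y_received - y_true                # nonzero even without any fault

    # Delay-aware threshold: worst-case signal change over the assumed max delay
    d_max = 10 * dt
    threshold = np.max(np.abs(np.gradient(y_true, dt))) * d_max

    print(np.sum(np.abs(residual) > 0.01))        # naive fixed threshold: many alarms
    print(np.sum(np.abs(residual) > threshold))   # delay-aware threshold: none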
Abstract:
Multiple models for knowledge management and for measuring human capital currently exist and are applied in organizations, but none of them has been designed for higher education institutions. This paper reviews some of the most prominent knowledge management and intellectual capital models, such as Nonaka and Takeuchi's knowledge conversion model, Arthur Andersen's KM model, and Kaplan and Norton's Balanced Scorecard, among others. Starting from Galbraith's Star organizational model, a theoretical proposal is then presented to characterize a knowledge management model applicable to the university functions of research and extension at Universidad CES – Medellín, Colombia. The work follows a qualitative approach in which, based on the correlation between the general theory of knowledge management (particularly the models) and the analysis of the characteristics of Universidad CES, together with a systematic review, a focus group, and documentary analysis, the Hexagonal Knowledge Management Model is proposed.
Abstract:
Ethernet is starting to move from local area networks into transport networks. However, since the requirements of transport networks are more demanding, the technology needs to be enhanced. Schemes designed to improve Ethernet so that it meets transport requirements can be grouped into two classes. The first class enhances only Ethernet's control components (STP-based technologies), while the second enhances both the control and the forwarding components (label-based technologies). This thesis analyses and compares label-space usage in the label-based technologies in order to guarantee their scalability. The applicability of existing techniques and studies that can be used to overcome or reduce the label scalability problems is evaluated. In addition, this thesis proposes an ILP formulation to compute the optimal performance of the STP-based technologies and compares them with the label-based ones, so as to determine which technology to use in a given situation.
Abstract:
We review the procedures and challenges that must be considered when using geoid data derived from the Gravity field and steady-state Ocean Circulation Explorer (GOCE) mission in order to constrain the circulation and water mass representation in an ocean general circulation model. This covers the combination of the geoid information with time-mean sea level information derived from satellite altimeter data to construct a mean dynamic topography (MDT), and considers how this complements the time-varying sea level anomaly, also available from the satellite altimeter. We particularly consider the compatibility of these different fields in their spatial scale content, their temporal representation, and their error covariances. These considerations are very important when the resulting data are to be used to estimate ocean circulation and its corresponding errors. We describe the further steps needed for assimilating the resulting dynamic topography information into an ocean circulation model using three different operational forecasting and data assimilation systems. We look at methods used for assimilating altimeter anomaly data in the absence of a suitable geoid, and then discuss different approaches that have been tried for assimilating the additional geoid information. We review the problems that have been encountered and the lessons learned in order to help future users. Finally, we present some results from the use of GRACE geoid information in the operational oceanography community and discuss the future potential gains that may be obtained from a new GOCE geoid.
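In notation of our own choosing (a hedged sketch of the standard relation, not a formula quoted from the review), the mean dynamic topography combines the altimetric time-mean sea surface with the geoid, and the altimeter anomaly adds the time-varying part:

    \mathrm{MDT}(\theta,\lambda) = \overline{\mathrm{SSH}}(\theta,\lambda) - N(\theta,\lambda),
    \qquad
    \mathrm{SSH}(\theta,\lambda,t) = \mathrm{MDT}(\theta,\lambda) + \mathrm{SLA}(\theta,\lambda,t),

where \overline{\mathrm{SSH}} is the time-mean sea surface from altimetry, N the geoid height from GOCE, and SLA the sea level anomaly. The compatibility issues discussed in the paper arise because the two terms on the right of the first equation carry different spatial scales and error covariances.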
Abstract:
The need for consistent assimilation of satellite measurements for numerical weather prediction led operational meteorological centers to assimilate satellite radiances directly using variational data assimilation systems. More recently there has been a renewed interest in assimilating satellite retrievals (e.g., to avoid the use of relatively complicated radiative transfer models as observation operators for data assimilation). The aim of this paper is to provide a rigorous and comprehensive discussion of the conditions for the equivalence between radiance and retrieval assimilation. It is shown that two requirements need to be satisfied for the equivalence: (i) the radiance observation operator needs to be approximately linear in a region of the state space centered at the retrieval and with a radius of the order of the retrieval error; and (ii) any prior information used to constrain the retrieval should not underrepresent the variability of the state, so as to retain the information content of the measurements. Both these requirements can be tested in practice. When these requirements are met, retrievals can be transformed so as to represent only the portion of the state that is well constrained by the original radiance measurements and can be assimilated in a consistent and optimal way, by means of an appropriate observation operator and a unit matrix as error covariance. Finally, specific cases when retrieval assimilation can be more advantageous (e.g., when the estimate sought by the operational assimilation system depends on the first guess) are discussed.
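Condition (i) can be stated compactly as follows (our notation, a hedged paraphrase rather than the paper's formula): with H the radiance observation operator, \hat{x} the retrieval, and \sigma_r the retrieval error,

    H(x) \approx H(\hat{x}) + \mathbf{H}(\hat{x})\,(x - \hat{x})
    \quad \text{for} \quad \lVert x - \hat{x} \rVert \lesssim \sigma_r ,

where \mathbf{H}(\hat{x}) is the Jacobian of H evaluated at the retrieval; that is, the operator must be well approximated by its tangent linear within a ball whose radius is of the order of the retrieval error.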
Abstract:
The currently available model-based global data sets of atmospheric circulation are a by-product of the daily requirement of producing initial conditions for numerical weather prediction (NWP) models. These data sets have been quite useful for studying fundamental dynamical and physical processes, and for describing the nature of the general circulation of the atmosphere. However, due to limitations in the early data assimilation systems and inconsistencies caused by numerous model changes, the available model-based global data sets may not be suitable for studying global climate change. A comprehensive analysis of global observations based on a four-dimensional data assimilation system with a realistic physical model should be undertaken to integrate space and in situ observations to produce internally consistent, homogeneous, multivariate data sets for the earth's climate system. The concept is equally applicable for producing data sets for the atmosphere, the oceans, and the biosphere, and such data sets will be quite useful for studying global climate change.
Abstract:
The Bollène-2002 Experiment was aimed at developing the use of a radar volume-scanning strategy for conducting radar rainfall estimations in the mountainous regions of France. A developmental radar processing system, called Traitements Régionalisés et Adaptatifs de Données Radar pour l’Hydrologie (Regionalized and Adaptive Radar Data Processing for Hydrological Applications), has been built and several algorithms were specifically produced as part of this project. These algorithms include 1) a clutter identification technique based on the pulse-to-pulse variability of reflectivity Z for noncoherent radar, 2) a coupled procedure for determining a rain partition between convective and widespread rainfall R and the associated normalized vertical profiles of reflectivity, and 3) a method for calculating reflectivity at ground level from reflectivities measured aloft. Several radar processing strategies, including nonadaptive, time-adaptive, and space–time-adaptive variants, have been implemented to assess the performance of these new algorithms. Reference rainfall data were derived from a careful analysis of rain gauge datasets furnished by the Cévennes–Vivarais Mediterranean Hydrometeorological Observatory. The assessment criteria for five intense and long-lasting Mediterranean rain events have proven that good quantitative precipitation estimates can be obtained from radar data alone within 100-km range by using well-sited, well-maintained radar systems and sophisticated, physically based data-processing systems. The basic requirements entail performing accurate electronic calibration and stability verification, determining the radar detection domain, achieving efficient clutter elimination, and capturing the vertical structure(s) of reflectivity for the target event. Radar performance was shown to depend on type of rainfall, with better results obtained with deep convective rain systems (Nash coefficients of roughly 0.90 for point radar–rain gauge comparisons at the event time step), as opposed to shallow convective and frontal rain systems (Nash coefficients in the 0.6–0.8 range). In comparison with time-adaptive strategies, the space–time-adaptive strategy yields a very significant reduction in the radar–rain gauge bias while the level of scatter remains basically unchanged. Because the Z–R relationships have not been optimized in this study, results are attributed to an improved processing of spatial variations in the vertical profile of reflectivity. The two main recommendations for future work consist of adapting the rain separation method for radar network operations and documenting Z–R relationships conditional on rainfall type.
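As a point of reference for the Z–R step mentioned above (which the study deliberately did not optimize), the sketch below applies the textbook Marshall–Palmer relation to convert reflectivity to rain rate; the coefficients are standard assumptions, not those of the Bollène-2002 processing chain.

    def rain_rate_from_dbz(dbz: float, a: float = 200.0, b: float = 1.6) -> float:
        """Invert Z = a * R**b, with Z converted from dBZ to mm^6/m^3."""
        z_linear = 10.0 ** (dbz / 10.0)
        return (z_linear / a) ** (1.0 / b)

    for dbz in (20, 35, 50):
        print(dbz, "dBZ ->", round(rain_rate_from_dbz(dbz), 2), "mm/h")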
Abstract:
The DIAMET (DIAbatic influences on Mesoscale structures in ExTratropical storms) project aims to improve forecasts of high-impact weather in extratropical cyclones through field measurements, high-resolution numerical modeling, and improved design of ensemble forecasting and data assimilation systems. This article introduces DIAMET and presents some of the first results. Four field campaigns were conducted by the project, one of which, in late 2011, coincided with an exceptionally stormy period marked by an unusually strong, zonal North Atlantic jet stream and a succession of severe windstorms in northwest Europe. As a result, December 2011 had the highest monthly North Atlantic Oscillation index (2.52) of any December in the last 60 years. Detailed observations of several of these storms were gathered using the UK’s BAe146 research aircraft and extensive ground-based measurements. As an example of the results obtained during the campaign, observations are presented of cyclone Friedhelm on 8 December 2011, when surface winds with gusts exceeding 30 m s⁻¹ crossed central Scotland, leading to widespread disruption to transportation and electricity supply. Friedhelm deepened 44 hPa in 24 hours and developed a pronounced bent-back front wrapping around the storm center. The strongest winds at 850 hPa and the surface occurred in the southern quadrant of the storm, and detailed measurements showed these to be most intense in clear air between bands of showers. High-resolution ensemble forecasts from the Met Office showed similar features, with the strongest winds aligned in linear swaths between the bands, suggesting that there is potential for improved skill in forecasts of damaging winds.
Abstract:
Uncertainty in ocean analysis methods and deficiencies in the observing system are major obstacles for the reliable reconstruction of the past ocean climate. The variety of existing ocean reanalyses is exploited in a multi-reanalysis ensemble to improve the ocean state estimation and to gauge uncertainty levels. The ensemble-based analysis of signal-to-noise ratio allows the identification of ocean characteristics for which the estimation is robust (such as tropical mixed-layer-depth, upper ocean heat content), and where large uncertainty exists (deep ocean, Southern Ocean, sea ice thickness, salinity), providing guidance for future enhancement of the observing and data assimilation systems.
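The sketch below illustrates the ensemble-based signal-to-noise idea in its simplest form: the multi-reanalysis mean is taken as the signal and the spread between products as the noise, so a high ratio flags quantities whose estimation is robust. This is a hedged toy example with random placeholder data, not the paper's diagnostics.

    import numpy as np

    rng = np.random.default_rng(0)
    n_products, n_years = 5, 30
    # Placeholder "heat content" time series, one row per reanalysis product
    heat_content = rng.normal(0.0, 1.0, (n_products, n_years)).cumsum(axis=1)

    signal = heat_content.mean(axis=0)            # multi-reanalysis mean
    noise = heat_content.std(axis=0, ddof=1)      # spread between reanalyses
    snr = np.abs(signal) / noise                  # high SNR => robust estimate

    print(np.round(snr, 2))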