37 results for Information integration
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
Dissertation presented to the Escola Superior de Comunicação Social in partial fulfilment of the requirements for the degree of Master in Journalism.
Abstract:
Integrated manufacturing constitutes a complex system made of heterogeneous information and control subsystems. These subsystems are not designed for cooperation: typically each subsystem automates specific processes and establishes a closed application domain, so it is very difficult to integrate it with other subsystems in order to respond to the needed process dynamics. Furthermore, to cope with ever-growing market competition and demands, manufacturing/enterprise systems need to increase their responsiveness based on up-to-date knowledge and in-time data gathered from the diverse information and control systems. This has created new challenges for the manufacturing sector, and even bigger challenges for collaborative manufacturing. The growing complexity of information and communication technologies when coping with innovative business services based on collaborative contributions from multiple stakeholders requires novel, multidisciplinary approaches. Service orientation is a strategic approach to deal with such complexity and with the various stakeholders' information systems. Services, or more precisely the autonomous computational agents implementing them, provide an architectural pattern able to cope with the needs of integrated and distributed collaborative solutions. This paper proposes a service-oriented framework aiming to support a virtual organizations breeding environment, the basis for establishing short- or long-term goal-oriented virtual organizations. A key element is the notion of integrated business services, through which customers receive value developed through the contributions of a network of companies.
Abstract:
Electric vehicles (EVs) offer great potential to support the integration of renewable energy sources (RES) into the power grid, and thus to reduce dependence on oil as well as greenhouse gas (GHG) emissions. The high share of wind energy in the Portuguese energy mix expected for 2020 can lead to curtailment, especially during the winter when high levels of hydro generation occur. In this paper a methodology based on unit commitment and economic dispatch is implemented, and a hydro-thermal dispatch is performed in order to evaluate the impact of EV integration into the grid. Results show that the considered 10 % penetration of EVs in the Portuguese fleet would increase load by 3 % and would not integrate a significant amount of wind energy, because curtailment is already low in the absence of EVs. According to the results, the EVs are charged mostly with thermal generation, and the associated emissions are much higher than if they were calculated based on the generation mix.
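The dispatch logic summarized above can be illustrated with a minimal merit-order sketch (all numbers here are made up for illustration, not the paper's Portuguese system data): wind is used first, any excess is curtailed, and the residual load is met by thermal units in cost order, so extra EV load arriving when curtailment is already zero ends up served by thermal generation.

```python
# Toy merit-order economic dispatch (hypothetical data, not the
# paper's model): wind first, excess wind curtailed, remaining load
# covered by thermal units sorted by marginal cost.

def dispatch(load_mw, wind_mw, thermal_units):
    """thermal_units: list of (capacity_mw, marginal_cost) tuples."""
    wind_used = min(load_mw, wind_mw)
    curtailed = wind_mw - wind_used
    residual = load_mw - wind_used
    schedule = []
    for cap, cost in sorted(thermal_units, key=lambda u: u[1]):
        gen = min(cap, residual)
        schedule.append((gen, cost))
        residual -= gen
    return wind_used, curtailed, schedule

# Adding an EV charging block (3 % of a 6000 MW load) raises load but,
# with no curtailment to absorb, is met entirely by thermal units.
base = dispatch(load_mw=6000, wind_mw=2000,
                thermal_units=[(3000, 40), (3000, 70)])
with_ev = dispatch(load_mw=6180, wind_mw=2000,
                   thermal_units=[(3000, 40), (3000, 70)])
```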
Abstract:
Thesis to obtain the Master of Science Degree in Computer Science and Engineering
Abstract:
This paper develops an energy management system with integration of smart meters for electricity consumers in a smart grid context. The integration of two types of smart meters (SM) is developed: (i) consumer-owned SM and (ii) distributor-owned SM. The consumer-owned SM runs over a wireless platform (ZigBee protocol) and the distributor-owned SM uses a wired environment (Modbus protocol). The SM are connected to a SCADA (Supervisory Control and Data Acquisition) system that supervises a network of Programmable Logic Controllers (PLC). The SCADA system/PLC network integrates different types of information coming from the several technologies present in modern buildings. The developed control strategy implements a hierarchical cascade controller in which inner loops are performed by local PLCs and the outer loop is managed by a centralized SCADA system that interacts with the entire local PLC network. In order to implement advanced controllers, a communication channel was developed between the SCADA system and the MATLAB software. (C) 2014 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
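The hierarchical cascade structure described above can be sketched as a toy (with invented gains and a simple proportional rule standing in for the local PLC logic, not the paper's implementation): the outer supervisory loop splits a global demand into setpoints, and each inner loop tracks its setpoint locally.

```python
# Illustrative cascade: a central supervisor (outer loop) computes
# setpoints; each local controller (inner loop) tracks its setpoint
# with a proportional correction. Gains and structure are made up.

class LocalPLC:
    def __init__(self, gain=0.5):
        self.gain = gain
        self.output = 0.0

    def step(self, setpoint, measurement):
        # Inner loop: move the output toward the setpoint.
        self.output += self.gain * (setpoint - measurement)
        return self.output

class Supervisor:
    """Outer loop: splits a global demand across the PLC network."""
    def __init__(self, plcs):
        self.plcs = plcs

    def step(self, global_demand, measurements):
        setpoint = global_demand / len(self.plcs)
        return [plc.step(setpoint, m)
                for plc, m in zip(self.plcs, measurements)]

plcs = [LocalPLC(), LocalPLC()]
outputs = Supervisor(plcs).step(global_demand=10.0,
                                measurements=[0.0, 0.0])
```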
Abstract:
Sticky-information monetary models have been used in the macroeconomic literature to explain some of the observed features of inflation dynamics. In this paper, we explore the consequences of relaxing the rational expectations assumption usually taken in this type of model; in particular, by considering expectations formed through adaptive learning, it is possible to arrive at results other than the trivial convergence to a fixed-point long-term equilibrium. The results involve the possibility of endogenous cyclical motion (periodic and aperiodic), which emerges essentially in scenarios of hyperinflation. In low-inflation settings, the introduction of learning implies a less severe impact of monetary shocks, which nevertheless tend to last for additional time periods relative to the pure perfect-foresight setup.
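A generic constant-gain learning recursion illustrates the kind of expectations mechanism the abstract refers to (the gain, the inflation equation, and all parameters below are made up, not the paper's sticky-information model): agents revise forecasts toward observed inflation, so a one-off monetary shock decays gradually rather than vanishing at once.

```python
# Toy constant-gain adaptive learning: the forecast is nudged toward
# each observation, and inflation itself depends on the forecast, so
# a one-period shock leaves a slowly fading trace.

def update_forecast(forecast, observed, gain=0.2):
    return forecast + gain * (observed - forecast)

forecast, path = 0.0, []
for t in range(20):
    shock = 1.0 if t == 0 else 0.0       # one-off monetary shock
    observed = 0.5 * forecast + shock    # illustrative inflation rule
    forecast = update_forecast(forecast, observed)
    path.append(observed)
```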
Abstract:
The Wyner-Ziv video coding (WZVC) rate-distortion performance is highly dependent on the quality of the side information, an estimate of the original frame created at the decoder. This paper characterizes WZVC efficiency when motion compensated frame interpolation (MCFI) techniques are used to generate the side information, a difficult problem in WZVC especially because the decoder only has some decoded reference frames available. The proposed WZVC compression efficiency rate model relates the power spectral density of the estimation error to the accuracy of the MCFI motion field. Some interesting conclusions can then be derived regarding the impact on compression performance of the motion field's smoothness and of its correlation to the true motion trajectories.
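A toy version of the side-information quality question can be sketched as follows (using per-pixel error power on 1-D "frames" instead of the paper's spectral model, purely for illustration): the missing frame is interpolated from its neighbours and the estimation-error power is measured against the original.

```python
# Illustrative MCFI baseline: estimate the missing frame by averaging
# co-located pixels of its neighbours (zero-motion interpolation),
# then measure the mean squared estimation error.

def interpolate(prev_frame, next_frame):
    return [(a + b) / 2 for a, b in zip(prev_frame, next_frame)]

def error_power(original, estimate):
    return sum((o - e) ** 2
               for o, e in zip(original, estimate)) / len(original)

# Linear motion is captured perfectly; other content is not.
estimate = interpolate([0, 2, 4], [2, 4, 6])
power = error_power([1, 3, 5], estimate)
```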
Abstract:
One of the most efficient approaches to generate the side information (SI) in distributed video codecs is motion compensated frame interpolation, where the current frame is estimated based on past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it would be useful to design an architecture where the SI can be generated more robustly at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second corresponds to a motion compensated quality enhancement (MCQE) technique in which a low-quality Intra block sent by the encoder is used to generate the SI by performing motion estimation with the help of the reference frames. The novel MCQE mode can be advantageous overall from the rate-distortion point of view, even if some rate has to be invested in the low-quality Intra-coded blocks, for blocks where MCI produces SI with lower correlation. The overall solution is evaluated in terms of RD performance, with improvements up to 2 dB, especially for high-motion video sequences and long Group of Pictures (GOP) sizes.
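The two-mode, block-level idea can be sketched as a simple per-block decision (the correlation measure and threshold below are hypothetical, not the codec's actual criterion): well-correlated blocks keep the interpolation mode, while poorly correlated blocks fall back to the Intra-assisted MCQE mode.

```python
# Hypothetical block-level mode decision: blocks whose interpolated
# SI correlates well with the references stay in MCI; the rest use
# the encoder-assisted MCQE mode.

def choose_mode(correlation, threshold=0.8):
    return "MCI" if correlation >= threshold else "MCQE"

modes = [choose_mode(c) for c in (0.95, 0.40, 0.85)]
```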
Abstract:
This work was carried out with the purpose of interpreting, understanding and explaining some process simulation tools, in particular Aspen Energy Analyzer (AEA), Aspen Economic Evaluation (AEE) and their integrated operation with Aspen Hysys (AH). AH is a process modelling tool for the design of chemical engineering projects; AEA is a modelling tool for heat-integration networks; and AEE, integrated into AH, is a tool that allows economic studies to be incorporated at a preliminary stage of an engineering project. The work was approached through case studies. Case I was based on solving a problem in AEA by building and optimizing a heat exchanger network. Cases II and III were based on building flowsheets for the production of benzene and of vinyl chloride, respectively, with each case divided into two different scenarios. For this purpose, AEA was used for the heat integration of the processes, AH for building the process flowsheet, and AEE for the economic studies of the different scenarios. Finally, Cases IV and V concern the solution of a heat-integration problem. Case IV was based on optimizing the heat exchanger network by increasing its area, while Case V was based on the initial stream data of the previous case and on two different scenarios in which the influence of utility prices on the construction of the exchanger network was studied. The conclusion was that modelling tools, particularly AH, AEA and AEE, are an extraordinary asset in helping the user make decisions at very early stages of process engineering.
Abstract:
The increasing density of information technology equipment and the simultaneous growth in processor power consumption require adequate distribution of cold air, removal of hot air, sufficient cooling capacity and a reduction in energy consumption. Considering cogeneration an energy-efficient alternative to other methods of energy production, this work analyses the profitability of integrating a cogeneration system into a data centre.
Abstract:
This work proposes a real-time data integration solution in the context of public transport. With the growing number of alternatives offered to public transport users, it is important that users know all of them, based on real-time information, so that they can make the choice that best fits their needs. Public transport operators, in turn, should be able to make all the required information available with minimal effort or changes to the systems they already have in place. This work uses tools that provide a homogeneous view over several heterogeneous data sources, that homogeneity being the integration point between all the data sources and the client applications.
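The homogeneous-view idea described above can be sketched with per-source adapters (all field names below are invented for illustration): each heterogeneous feed is mapped to one common record shape, and client applications consume only the integrated view.

```python
# Illustrative mediator over heterogeneous transport feeds: each
# adapter maps a source-specific record to one common shape, so
# clients see a single homogeneous view.

def adapt_bus(record):
    return {"line": record["linha"], "eta_min": record["tempo_espera"]}

def adapt_metro(record):
    return {"line": record["line_id"],
            "eta_min": record["arrival_s"] // 60}

def integrated_view(bus_feed, metro_feed):
    return ([adapt_bus(r) for r in bus_feed]
            + [adapt_metro(r) for r in metro_feed])

view = integrated_view([{"linha": "728", "tempo_espera": 4}],
                       [{"line_id": "Azul", "arrival_s": 180}])
```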
Abstract:
The work presented in this document addresses the problems arising from the need to integrate applications that were developed at different points in time by different teams and that must communicate with each other to enrich business processes. The integration must be transparent to the applications, provided by a generic, robust piece of software at no cost to the development teams at integration time, and it must let the applications communicate using whatever protocols they choose. This work proposes a message-oriented middleware as a solution to the identified problem. The proposed solution enables communication between applications that use different protocols, and also provides temporal, spatial and synchronization decoupling in application communication. The implementation is based on a content-based publish/subscribe system and has to handle the greater computational demands this type of system entails, a cost justified by the richer event-subscription semantics. The implementation uses a semi-distributed architecture in order to increase the system's scalability, which means it must handle event routing and subscription dissemination across the various event servers. The solution provides persistence guarantees, transactional processing and fault tolerance, as well as event transformation between the supported protocols. Extensibility is achieved through a plugin system that allows support for new communication protocols to be added. The protocols supported by the final implementation are RestMS and TCP.
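The content-based publish/subscribe core of such a middleware can be sketched in a few lines (a minimal, single-broker toy without the persistence, transactions, or multi-server routing described above): subscriptions are predicates over event content, which is what makes the subscription semantics richer than plain topic matching.

```python
# Minimal content-based publish/subscribe: a subscription is a
# predicate over the event's content plus a callback; publishing
# delivers the event to every matching subscriber.

class Broker:
    def __init__(self):
        self.subscriptions = []

    def subscribe(self, predicate, callback):
        self.subscriptions.append((predicate, callback))

    def publish(self, event):
        for predicate, callback in self.subscriptions:
            if predicate(event):
                callback(event)

received = []
broker = Broker()
broker.subscribe(lambda e: e["type"] == "order" and e["amount"] > 100,
                 received.append)
broker.publish({"type": "order", "amount": 250})   # matches
broker.publish({"type": "order", "amount": 50})    # filtered out
```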
Abstract:
A new approach is proposed, based on a methodology assisted by a tool, to create new products in the automobile industry from previously defined processes and experiences, inspired by a set of best practices or principles: it is based on high-level models or specifications; it is component-based-architecture centric; and it is based on generative programming techniques. This approach follows in essence the MDA (Model Driven Architecture) philosophy, with some specific characteristics. We propose a repository that keeps related information, such as models, applications, design information, generated artifacts and even information concerning the development process itself (e.g., generation steps, tests and integration milestones). Generically, this methodology receives as its main inputs the users' requirements for a new product (e.g., functional, non-functional, product specification) and produces as its main output a set of artifacts (e.g., design parts, process validation output) that will be integrated into the engineering design tool (e.g., a CAD system), facilitating the work.
Abstract:
Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC where side information is made available at the decoder, enables a flexible distribution of the computational complexity between the encoder and decoder, promising to fulfill novel requirements from applications such as video surveillance, sensor networks and mobile camera phones. The quality of the side information at the decoder has a critical role in determining the WZ video coding rate-distortion (RD) performance, notably in raising it to a level as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder only has some decoded reference frames available. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper, along with a new frame interpolation framework able to generate higher-quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated into a transform-domain, turbo-coding-based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements up to 2 dB; moreover, it outperforms H.264/AVC Intra by up to 3 dB with a lower encoding complexity.
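The motion-field regularization idea can be sketched as simple neighbour smoothing (a 1-D toy with an invented weighting, not the paper's actual criterion): each block's motion vector is pulled toward the mean of its neighbours, favouring a smooth field over isolated outlier vectors.

```python
# Illustrative motion-field regularization on a 1-D list of (vx, vy)
# block vectors: each vector moves toward its neighbours' mean.

def regularize(motion_field, weight=0.5):
    smoothed = []
    for i, (vx, vy) in enumerate(motion_field):
        neighbours = [motion_field[j] for j in (i - 1, i + 1)
                      if 0 <= j < len(motion_field)]
        if not neighbours:  # single-block field: nothing to smooth
            smoothed.append((vx, vy))
            continue
        mx = sum(v[0] for v in neighbours) / len(neighbours)
        my = sum(v[1] for v in neighbours) / len(neighbours)
        smoothed.append((vx + weight * (mx - vx),
                         vy + weight * (my - vy)))
    return smoothed

# The isolated outlier vector (10, 0) is pulled toward its still
# neighbours, and they toward it.
smoothed = regularize([(0, 0), (10, 0), (0, 0)])
```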
Abstract:
Preliminary version