907 results for Distributed network protocol
Abstract:
Deuterium NMR was used to investigate the orientational order in a cellulosic composite formed by liquid crystalline acetoxypropylcellulose (APC) and the deuterated nematic 4'-pentyl-4-cyanobiphenyl (5CB-alpha d(2)), with a percentage of 85% APC by weight. Three forms of the composite, including electrospun microfibers, thin films and bulk samples, were analyzed. The NMR results initially suggest two distinct scenarios: one where the 5CB-alpha d(2) is confined to small droplets with dimensions smaller than the magnetic coherence length, and the other where the 5CB-alpha d(2) molecules are aligned with the APC network chains. Polarized optical microscopy (POM) of thin film samples, along with all the NMR results, shows the presence of 5CB-alpha d(2) droplets in the composite systems with a nematic wetting layer at the APC-5CB-alpha d(2) interface that experiences an order-disorder transition driven by the polymer network N-I transition. The characterization of the APC network I-N transition shows a pronounced subcritical behavior within a heterogeneity scenario.
Abstract:
One of the most efficient approaches to generate the side information (SI) in distributed video codecs is motion compensated frame interpolation, where the current frame is estimated based on past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it is useful to design an architecture where the SI can be generated more robustly at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second mode corresponds to a motion compensated quality enhancement (MCQE) technique, where a low-quality Intra block sent by the encoder is used to generate the SI by performing motion estimation with the help of the reference frames. For blocks where MCI produces SI with lower correlation, the novel MCQE mode can be advantageous overall from the rate-distortion point of view, even if some rate has to be invested in the low-quality Intra coded blocks. The overall solution is evaluated in terms of RD performance, with improvements of up to 2 dB, especially for high-motion video sequences and long Group of Pictures (GOP) sizes.
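As an illustration of the two SI generation modes described above, the following Python sketch produces a block either by motion compensated interpolation or, when a low-quality Intra hint is available, by an MCQE-style motion search against the reference frames. The function name, the zero-motion MCI simplification, the ±4 search range and the SAD criterion are assumptions made for this example, not the paper's actual implementation.

```python
import numpy as np

def generate_si_block(prev_ref, next_ref, block_xy, block_size=8,
                      intra_hint=None):
    """Sketch of block-level side-information generation.

    Without an Intra hint the block is produced by a simplified MCI
    (zero-motion interpolation between the reference frames).  With a
    hint, an MCQE-style refinement is used: the candidate in the
    reference frames that best matches the hint (smallest SAD) is taken
    as SI.  Names, search range and criterion are illustrative only.
    """
    x, y = block_xy
    if intra_hint is None:
        # MCI (simplified): average of the co-located reference blocks.
        past = prev_ref[y:y + block_size, x:x + block_size]
        futr = next_ref[y:y + block_size, x:x + block_size]
        return (past.astype(np.int32) + futr.astype(np.int32)) // 2

    # MCQE: search both reference frames around the co-located position
    # for the block most similar to the low-quality Intra hint.
    best_sad, best_block = None, None
    h, w = prev_ref.shape
    for ref in (prev_ref, next_ref):
        for dy in range(-4, 5):
            for dx in range(-4, 5):
                yy, xx = y + dy, x + dx
                if 0 <= yy <= h - block_size and 0 <= xx <= w - block_size:
                    cand = ref[yy:yy + block_size, xx:xx + block_size]
                    sad = np.abs(cand.astype(np.int32)
                                 - intra_hint.astype(np.int32)).sum()
                    if best_sad is None or sad < best_sad:
                        best_sad, best_block = sad, cand
    return best_block
```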
Abstract:
Motion compensated frame interpolation (MCFI) is one of the most efficient solutions to generate side information (SI) in the context of distributed video coding. However, it creates SI with rather significant motion compensated errors for some frame regions, while the errors remain rather small for other regions, depending on the video content. In this paper, a low-complexity Intra mode selection algorithm is proposed to select the most 'critical' blocks in the WZ frame and help the decoder with some reliable data for those blocks. For each block, the novel coding mode selection algorithm estimates the encoding rate for the Intra-based and WZ coding modes and determines the best coding mode while maintaining a low encoder complexity. The proposed solution is evaluated in terms of rate-distortion performance, with improvements of up to 1.2 dB compared with a WZ-only coding mode solution.
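A minimal sketch of the kind of per-block decision described above, assuming a simple rate comparison and a scalar 'criticality' measure; the names, threshold and estimators are illustrative and much simpler than the proposed low-complexity algorithm.

```python
def select_block_mode(intra_rate_est, wz_rate_est, criticality, threshold=0.5):
    """Pick the coding mode for one block of a WZ frame.

    intra_rate_est, wz_rate_est: rough encoder-side rate estimates (bits).
    criticality: expected unreliability of the decoder-generated SI, in [0, 1].
    Only 'critical' blocks whose Intra estimate is competitive are promoted
    to Intra coding; every other block stays WZ coded.  Hypothetical rule.
    """
    if criticality > threshold and intra_rate_est <= wz_rate_est:
        return "INTRA"
    return "WZ"

# Example: a high-motion block with poor expected SI is promoted to Intra.
print(select_block_mode(intra_rate_est=420, wz_rate_est=480, criticality=0.8))
```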
Abstract:
Information security (INFOSEC) threats target the loss of confidentiality, integrity and availability, so organizations are driven to implement security policies, at both the physical and the logical level, using specific defense mechanisms. The Network Air Gap Controller (NAGC) project was conceived to contribute to security issues, namely those directly related to the transfer of information between networks of different security classifications or distinct sensitivities, without real-time communication requirements, and which warrant greater effort regarding robustness, availability and control. Organizations that, because of their assigned duties and competences, need to move information between networks of this type are sometimes forced to carry out the data transfer through a manual process, performed by a person rather than by a machine, involving removable devices such as CDs, DVDs, pen drives, external disks or manual switches. In this process, commonly referred to as a Network Air Gap (NAG), the person responsible for the data transfer must infallibly guarantee, as intrinsic and inalienable duties of the role, compliance with a vast set of regulatory rules. The established rules translate into tools and procedures intended, for example, to archive all transfers performed; to use security tools (e.g. antivirus) before placing information on the higher-classification network; to forbid the transfer of certain file types (e.g. executables); and to guarantee that, in line with the autonomy normally delegated to the person responsible for operating the communications, information is only transferred from the lower-classification network to the higher-classification network. Given the value of the information and the impact on the image of this type of organization, a communications operator who does not scrupulously comply with these rules is inexorably removed from those duties, and the accountability process cannot always determine unequivocally whether the causes point to a deliberate act or to unintentional factors such as ineptitude, carelessness or fatigue. In reality, periodic and routine activities make humans prone to failure, and these activities could be reliably ensured, without any constraints or reduction of guarantees, by technological solutions, provided these are properly parameterized, adapted, tested and matured, freeing human resources for maintenance, management, control and inspection tasks. Moreover, for this type of organization, where the number of incoming information networks multiplies, with different classifications, distinct actors and specific recipients, the use of this type of mechanism is of capital importance. Because of this multiplicative factor, the NAGC must represent a valid option in terms of technological offer, namely within a range of very low cost products, and must be able to grow in layers of complementary contribution, according to the real needs of each scenario.
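Below is a minimal sketch of the rule set that such a controller would automate: archiving every transfer, antivirus scanning, file-type blacklisting and low-to-high direction enforcement. Function and parameter names, the blacklist and the `scan_file` hook are assumptions made for this example, not the NAGC implementation.

```python
import hashlib
import shutil
from pathlib import Path

BLOCKED_EXTENSIONS = {".exe", ".dll", ".bat", ".sh"}   # illustrative blacklist

def validate_and_transfer(src: Path, archive_dir: Path, dst_dir: Path,
                          direction: str, scan_file) -> bool:
    """Minimal sketch of the NAG transfer rules described above.

    `scan_file` stands in for an external antivirus scanner and
    `direction` for the declared flow ("low_to_high" is the only one
    allowed).  Names and checks are assumptions for this sketch.
    """
    if direction != "low_to_high":                # only low -> high permitted
        return False
    if src.suffix.lower() in BLOCKED_EXTENSIONS:  # reject forbidden file types
        return False
    if not scan_file(src):                        # antivirus check must pass
        return False

    # Archive a copy (indexed by content hash) before releasing the file.
    digest = hashlib.sha256(src.read_bytes()).hexdigest()
    shutil.copy2(src, archive_dir / f"{digest}_{src.name}")
    shutil.copy2(src, dst_dir / src.name)
    return True
```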
Abstract:
Technological and societal evolution means that, nowadays, a large part of the population has access to mobile devices with advanced functionality. With this type of device we have access to countless sources of real-time information, but this capability is still not fully exploited. This project tries to take advantage of this reality by using these mobile devices to create a network for exchanging traffic information. Users only need their mobile device to automatically obtain the most recent traffic information while, in parallel, sharing their own information with other users. Although other alternatives exist on the market, with solutions offering the same type of functionality, none uses this type of device (using conventional GPS units instead, for example). One of the requirements for implementing this project is a geocoding solution. Several solutions were tested, but none fully met the requirements of this project, which led to the development of a new solution that does. The solution is highly modular, formed by several components, each with well-identified responsibilities. Its architecture is based on the design patterns of a Service Oriented Architecture. All components expose their operations through web services, and their discovery relies on the WS-Discovery protocol. These components can be divided into two categories: the core components, responsible for creating and offering the functionality required by this project, and the external modules, which include the applications that present the functionality to the user. Two ways of consuming the information offered by the SIAT service were created: a mobile application and a website. For mobile devices, an application was developed for the Windows Phone 7 operating system.
Abstract:
This thesis aims at the design and evaluation of a real-time wireless system for counting and classifying motor vehicles. It also intends to be an alternative to current equipment, which is very intrusive on roadways. The thesis includes a study of the wireless communications suitable for a network of road sensor devices, a study of the use of the magnetic field as the physical means of detecting and counting vehicles, and a study of the energy autonomy of the devices embedded in the road, relying, among other sources, on solar energy. The project carried out within the scope of this thesis incorporates, among other elements, the real-time digitization of the magnetic signature left in the Earth's magnetic field by a passing vehicle, its transmission to a server via radio and a WAN (Wide Area Network), and the development of software based on the ZigBee protocol stack. Applications were developed for the sensor device, for the coordinator, for the control panel and for the interface library of a future application server. The software developed for the sensor device incorporates detection and digitization cycles, with low-power sleep pauses, and activates the radio communications only during the transmission phase, thereby ensuring an energy-saving strategy. The results obtained confirm the viability of this technology for detecting and counting vehicles, as well as for capturing the signature using magnetoresistive sensors. They also made it possible to verify the range of the wireless communications with the sensor device embedded in the asphalt and to confirm the model for sizing the solar panel surface as well as the energy consumption model of the sensor device.
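A rough sketch of the sensor device's duty cycle described above: detection and digitization cycles separated by low-power sleep pauses, with the radio used only while sending a captured signature. The thresholds, timings and the simulated sensor/radio stubs are assumptions made for this example, not the thesis firmware.

```python
import random
import time

# Hypothetical thresholds and timings; the thesis tunes these experimentally
# for magnetoresistive sensors embedded in the asphalt.
DETECT_THRESHOLD = 30      # deviation from the Earth-field baseline
SAMPLE_PERIOD_S = 0.002    # signature sampling period
SLEEP_PERIOD_S = 0.05      # low-power pause between detection cycles

def read_magnetometer():
    """Stand-in for the magnetoresistive sensor driver (simulated here)."""
    return 512 + random.randint(-40, 40)

def radio_send(signature):
    """Stand-in for the ZigBee transmission to the coordinator."""
    print(f"sent {len(signature)} samples")

def sensor_loop(baseline=512, cycles=100):
    """Detection/digitization cycle with sleep pauses; the radio is only
    active while sending, mirroring the energy-saving strategy above."""
    for _ in range(cycles):
        sample = read_magnetometer()
        if abs(sample - baseline) > DETECT_THRESHOLD:
            # A vehicle disturbs the Earth's field: capture its signature.
            signature = [sample]
            while abs(sample - baseline) > DETECT_THRESHOLD:
                time.sleep(SAMPLE_PERIOD_S)
                sample = read_magnetometer()
                signature.append(sample)
            radio_send(signature)          # radio on only for this phase
        time.sleep(SLEEP_PERIOD_S)         # low-power sleep between cycles
```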
Abstract:
The devastating impact of the Sumatra tsunami of 26 December 2004 raised the question for scientists of how to forecast a tsunami threat. In 2005, the IOC-UNESCO XXIII assembly decided to implement a global tsunami warning system to cover the regions that were not yet protected, namely the Indian Ocean, the Caribbean, and the North East Atlantic, the Mediterranean and connected seas (the NEAM region). Within NEAM, the Gulf of Cadiz is the most sensitive area, with an important record of devastating historical events. The objective of this paper is to present a preliminary design for a reliable tsunami detection network for the Gulf of Cadiz, based on a network of sea-level observatories. The tsunamigenic potential of this region has been revised in order to define the active tectonic structures. Tsunami hydrodynamic modeling and GIS technology have been used to identify the appropriate locations for the minimum number of sea-level stations. Results show that 3 tsunameters are the minimum number of stations necessary to assure an acceptable protection to the large coastal population in the Gulf of Cadiz. In addition, 29 tide gauge stations could be necessary to fully assess the effects of a tsunami along the affected coasts of Portugal, Spain and Morocco.
Abstract:
This paper presents an artificial neural network approach for short-term wind power forecasting in Portugal. The increased integration of wind power into the electric grid, as nowadays occurs in Portugal, poses new challenges due to its intermittency and volatility. Hence, good forecasting tools play a key role in tackling these challenges. The accuracy of the wind power forecasting attained with the proposed approach is evaluated against persistence and ARIMA approaches, reporting the numerical results from a real-world case study.
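A toy illustration, in Python, of comparing a small neural network forecaster against the persistence baseline on lagged wind-power data. The synthetic series, lag count and network size are assumptions made for the sketch; the paper evaluates real Portuguese data and also compares against ARIMA.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def lagged_dataset(series, n_lags=6):
    """Build (X, y) pairs where X holds the previous n_lags observations."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

# Synthetic wind-power-like series used purely for illustration.
rng = np.random.default_rng(0)
t = np.arange(1000)
series = 50 + 20 * np.sin(2 * np.pi * t / 144) + rng.normal(0, 5, t.size)

X, y = lagged_dataset(series)
split = 800
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])

ann_mae = np.abs(model.predict(X[split:]) - y[split:]).mean()
persistence_mae = np.abs(X[split:, -1] - y[split:]).mean()  # last value carried forward
print(f"ANN MAE {ann_mae:.2f}  vs  persistence MAE {persistence_mae:.2f}")
```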
Abstract:
Nowadays, cooperative intelligent transport systems are part of a larger system. Transportation consists of modal operations integrated in logistics, and logistics is the main process of supply chain management. Strategic supply chain management, as a simultaneously local and global value chain, is a collaborative/cooperative organization of stakeholders, often in co-opetition, to perform a service for the customers while respecting time, place, price and quality levels. Transportation, like other logistics operations, must add value, which is achieved in this case through compressing lead times and fulfilling orders. The complex supplier network and the distribution channels must be efficient, and the integral visibility (monitoring and tracing) of the supply chain is a significant source of competitive advantage. Nowadays, competition takes place not between companies but among supply chains. This paper aims to highlight the current and emerging manufacturing and logistics system challenges as a new field of opportunities for the automation and control systems research community. Furthermore, the paper forecasts the use of radio frequency identification (RFID) technologies integrated into an information and communication technologies (ICT) framework based on distributed artificial intelligence (DAI) supported by a multi-agent system (MAS) as the main value advantage of supply chain management (SCM) in cooperative intelligent logistics systems. Logistical platforms (production or distribution), as value-adding nodes of supply and distribution networks, are proposed as critical points of inventory visibility, where these technological needs are most evident.
Abstract:
Master's degree in Radiation Applied to Health Technologies. Specialization area: Magnetic Resonance
Abstract:
Screening programs, particularly the inclusion of specific orthoptic tests to detect visual abnormalities, vary among countries. This study aims to: 1) describe expert perception of issues related to children's visual screening; 2) identify specific orthoptic tests to detect visual abnormalities in children's visual screening.
Abstract:
Integrated manufacturing constitutes a complex system made of heterogeneous information and control subsystems. Those subsystems are not designed for cooperation. Typically, each subsystem automates specific processes and establishes closed application domains, so it is very difficult to integrate it with other subsystems in order to respond to the required process dynamics. Furthermore, to cope with ever-growing market competition and demands, manufacturing/enterprise systems need to increase their responsiveness based on up-to-date knowledge and in-time data gathered from the diverse information and control systems. This has created new challenges for the manufacturing sector, and even bigger challenges for collaborative manufacturing. The growing complexity of information and communication technologies, when coping with innovative business services based on collaborative contributions from multiple stakeholders, requires novel and multidisciplinary approaches. Service orientation is a strategic approach to deal with such complexity and with the various stakeholders' information systems. Services, or more precisely the autonomous computational agents implementing the services, provide an architectural pattern able to cope with the needs of integrated and distributed collaborative solutions. This paper proposes a service-oriented framework aiming to support a virtual organization breeding environment, which is the basis for establishing short- or long-term goal-oriented virtual organizations. The notion of integrated business services, where customers receive value developed through the contribution of a network of companies, is a key element.
Abstract:
The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms, such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each one correlated with the source according to different correlation noise channels. Thus, it is proposed to exploit multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to obtain coding efficiency improvements. At the decoder side, the multiple SI hypotheses are created with motion compensated frame interpolation and fused together in a novel iterative LDPC-based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bitrate savings up to 8.0% are obtained for similar decoded quality.
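A deliberately simplified sketch of the idea of combining several SI hypotheses before Slepian-Wolf decoding: each hypothesis is modeled as a binary symmetric channel and its per-bit log-likelihood ratios are summed. This naive fusion and the BSC assumption stand in for the paper's iterative scheme, in which multiple LDPC syndrome decoders exchange extrinsic information.

```python
import numpy as np

def si_llr(si_bits, crossover_p):
    """Log-likelihood ratios for one SI hypothesis, modeling the correlation
    noise as a binary symmetric channel with the given crossover probability
    (an assumption made for this sketch)."""
    sign = 1 - 2 * si_bits.astype(np.float64)   # bit 0 -> +1, bit 1 -> -1
    return sign * np.log((1 - crossover_p) / crossover_p)

def fuse_hypotheses(si_list, p_list):
    """Naive fusion: sum the per-bit LLRs of all SI hypotheses; the result
    would then feed a syndrome (Slepian-Wolf) LDPC decoder."""
    return sum(si_llr(si, p) for si, p in zip(si_list, p_list))

# Illustrative use with two hypotheses of a 16-bit plane.
rng = np.random.default_rng(1)
source = rng.integers(0, 2, 16)
si_a = source ^ (rng.random(16) < 0.10)   # better-correlated hypothesis
si_b = source ^ (rng.random(16) < 0.25)   # worse-correlated hypothesis
llrs = fuse_hypotheses([si_a, si_b], [0.10, 0.25])
hard = (llrs < 0).astype(int)             # hard decision before LDPC decoding
print("bit errors after fusion:", int((hard != source).sum()))
```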
Abstract:
The use of distributed energy resources based on natural intermittent power sources, like wind generation, in power systems imposes the development of new adequate operation management and control methodologies. A short-term Energy Resource Management (ERM) methodology performed in two phases is proposed in this paper. The first phase addresses the day-ahead ERM scheduling and the second deals with the five-minute-ahead ERM scheduling. The ERM scheduling is a complex optimization problem due to the large number of variables and constraints. In this paper, the main goal is to minimize the operation costs from the point of view of a virtual power player that manages the network and the existing resources. The optimization problem is solved by a deterministic mixed-integer non-linear programming approach. A case study considering a distribution network with 33 buses, 66 distributed generation units, 32 loads with demand response contracts, 7 storage units and 1000 electric vehicles has been implemented in a simulator developed within the scope of the presented work, in order to validate the proposed short-term ERM methodology considering the dynamic power system behavior.
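To make the scheduling problem concrete, the sketch below solves a toy day-ahead dispatch as a linear program with SciPy: one distributed generator and one external supplier must meet a 24-period demand at minimum cost. All prices, limits and the linear (rather than mixed-integer non-linear) formulation are illustrative simplifications of the ERM problem described above.

```python
import numpy as np
from scipy.optimize import linprog

# Toy day-ahead dispatch with 24 periods, one DG unit and one external supplier.
T = 24
demand = 3.0 + 1.5 * np.sin(np.linspace(0, 2 * np.pi, T))   # MW per period
c_dg, c_sup = 40.0, 70.0          # EUR/MWh, assumed prices
p_dg_max, p_sup_max = 2.5, 5.0    # MW capacity limits

# Decision vector x = [dg_1..dg_T, sup_1..sup_T]; minimize total cost.
c = np.concatenate([np.full(T, c_dg), np.full(T, c_sup)])

# Power balance per period: dg_t + sup_t = demand_t.
A_eq = np.hstack([np.eye(T), np.eye(T)])
b_eq = demand

bounds = [(0, p_dg_max)] * T + [(0, p_sup_max)] * T
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("feasible:", res.success, " total cost:", round(res.fun, 1))
```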
Abstract:
Power systems have been undergoing huge changes, mainly due to the substantial increase of distributed generation and to operation in competitive environments. Virtual power players can aggregate a diversity of players, namely generators and consumers, and a diversity of energy resources, including electricity generation based on several technologies, storage and demand response. Resource management gains increasing relevance in this competitive context, while the active role of the demand side provides managers with increased demand elasticity. This makes the use of demand response more interesting and flexible, giving rise to a wide range of new opportunities. This paper proposes a methodology for managing demand response programs in the scope of virtual power players. The proposed method is based on the calculation of locational marginal prices (LMP). The evaluation of the impact of using specific demand response programs on the LMP value supports the manager's decision concerning demand response use. The proposed method has been computationally implemented and its application is illustrated in this paper using a 32-bus network with intensive use of distributed generation.
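The role of locational marginal prices in such a method can be pictured with a tiny two-bus example: when the line between a cheap and an expensive area is congested, the marginal cost of serving one extra MW differs by bus. The network, prices and limits below are assumptions made for the sketch, and the LMP is approximated numerically (by re-dispatching with one extra MW of load) rather than read from the solver's dual variables.

```python
from scipy.optimize import linprog

def dispatch_cost(load_bus2):
    """Least-cost dispatch of a toy 2-bus network: a cheap generator at
    bus 1, an expensive one at bus 2 and a 50 MW line between them.
    Variables x = [g1, g2, flow_1to2]; all numbers are illustrative."""
    c = [20.0, 60.0, 0.0]                    # EUR/MWh generation costs
    A_eq = [[1.0, 0.0, -1.0],                # bus 1 balance: g1 - f = 0
            [0.0, 1.0,  1.0]]                # bus 2 balance: g2 + f = load
    b_eq = [0.0, load_bus2]
    bounds = [(0, 100), (0, 100), (-50, 50)] # capacities and line limit
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    return res.fun

# Approximate the LMP at bus 2 as the extra cost of serving one more MW
# there; with the line congested, the expensive local unit sets the price.
base = dispatch_cost(80.0)
lmp_bus2 = dispatch_cost(81.0) - base
print(f"LMP at bus 2 ~ {lmp_bus2:.1f} EUR/MWh")
```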