29 results for information transmission

in Repositório Científico do Instituto Politécnico de Lisboa - Portugal


Relevance:

60.00%

Publisher:

Abstract:

Final Master's Project for obtaining the Master's degree in Maintenance Engineering

Relevance:

30.00%

Publisher:

Abstract:

This paper suggests that the thought of the North American critical theorist James W. Carey provides a relevant perspective on communication and technology. Against the background of American social pragmatism and the progressive thinkers of the beginning of the 20th century (such as Dewey, Mead, Cooley, and Park), Carey built a perspective that brought together the political economy of Harold A. Innis and the social criticism of David Riesman and Charles W. Mills, and incorporated Marxist topics such as commodification and sociocultural domination. The main goal of this paper is to explore the connection established by Carey between modern technological communication and what he called the "transmissive model", a model which not only reduces the symbolic process of communication to instrumentalization and to information delivery, but also politically converges with capitalism as well as with power, control and expansionist goals. Conceiving communication as a process that creates symbolic and cultural systems, in which and through which social life takes place, Carey gives equal emphasis to the incorporation processes of communication. If symbolic forms and culture are ways of conditioning action, they are also influenced by technological and economic materializations of symbolic systems, and by other conditioning structures. In Carey's view, communication is never a disembodied force; rather, it is a set of practices in which conceptions, techniques and social relations co-exist. These practices configure reality or, alternatively, can refute, transform and celebrate it. Exhibiting a sensibility attuned to the historical understanding of communication, media and information technologies, one of the issues Carey explored most was the history of the telegraph as a harbinger of the Internet, of its problems and contradictions. For Carey, the Internet was the contemporary heir of the communications revolution triggered by the prototype of transmission technologies, namely the 19th-century telegraph. In the telegraph Carey saw the prototype of many subsequent commercial empires based on science and technology, a pioneer model for complex business management, an example of conflict of interest over the control of patents, an inducer of changes both in language and in structures of knowledge, and a promoter of a futurist and utopian thought of information technologies. After a brief approach to Carey's communication theory, this paper focuses on his seminal essay "Technology and Ideology: The Case of the Telegraph", bearing in mind the prospect of the communication revolution introduced by the Internet. We maintain that this essay has seminal relevance for critically studying the information society. Our reading of it highlights the reach, as well as the problems, of an approach which conceives the innovation of the telegraph as a metaphor for all innovations, announcing the modern stage of history and determining to this day the major lines of development of modern communication systems.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a simulation of offshore wind systems in deep water, within a cloud-based engineering systems scope. The system is equipped with a permanent magnet synchronous generator and a full-power three-level converter, converting the electric energy at variable frequency into electric energy at constant frequency. The control strategies for the three-level converter are based on proportional-integral (PI) controllers. The electric energy is injected into the grid through an HVDC submarine transmission cable. The drive train is modeled by a three-mass model, taking into account the resistant stiffness torque of the structure and tower in deep water due to the moving surface elevation. Conclusions are drawn on the influence of the moving surface on the energy conversion. © IFIP International Federation for Information Processing 2015.
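
As an illustration of the control approach described above, the following is a minimal sketch of a discrete-time PI control step of the kind used in converter control loops; the gains, sampling period and crude plant stand-in are illustrative assumptions, not values from the paper.

    # Minimal discrete-time PI controller sketch (illustrative gains).
    class PIController:
        def __init__(self, kp: float, ki: float, ts: float):
            self.kp, self.ki, self.ts = kp, ki, ts
            self.integral = 0.0

        def update(self, reference: float, measurement: float) -> float:
            """One control step: returns the converter actuation signal."""
            error = reference - measurement
            self.integral += error * self.ts   # backward-Euler integration
            return self.kp * error + self.ki * self.integral

    # Example: drive a measured quantity toward a 1.0 p.u. reference.
    pi = PIController(kp=0.5, ki=20.0, ts=1e-4)
    measurement = 0.0
    for _ in range(5):
        u = pi.update(reference=1.0, measurement=measurement)
        measurement += 0.01 * u                # crude first-order plant stand-in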

Relevance:

20.00%

Publisher:

Abstract:

Sticky information monetary models have been used in the macroeconomic literature to explain some of the observed features of inflation dynamics. In this paper, we explore the consequences of relaxing the rational expectations assumption usually adopted in this type of model; in particular, by considering expectations formed through adaptive learning, it is possible to arrive at results other than the trivial convergence to a fixed-point long-term equilibrium. The results involve the possibility of endogenous cyclical motion (periodic and aperiodic), which emerges essentially in scenarios of hyperinflation. In low-inflation settings, the introduction of learning implies a less severe impact of monetary shocks, which nevertheless tend to last for additional time periods relative to the pure perfect-foresight setup.
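
For readers unfamiliar with adaptive learning, a minimal sketch of a constant-gain learning recursion follows; the gain value and the toy law of motion for inflation are assumptions for illustration only, not the paper's model.

    # Constant-gain adaptive learning: beliefs move a fraction 'gain'
    # toward the most recent forecast error. All values are hypothetical.
    gain = 0.05          # learning gain (illustrative)
    belief = 0.02        # agents' initial inflation expectation
    inflation = 0.0
    for t in range(200):
        inflation = 0.01 + 0.9 * belief       # toy actual law of motion
        belief += gain * (inflation - belief) # adaptive-learning update
    print(f"long-run inflation ~ {inflation:.4f}")
    # This parameterization converges to a fixed point; other settings
    # (e.g. hyperinflationary ones) can sustain periodic or aperiodic cycles.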

Relevance:

20.00%

Publisher:

Abstract:

The Wyner-Ziv video coding (WZVC) rate-distortion performance is highly dependent on the quality of the side information, an estimate of the original frame created at the decoder. This paper characterizes the WZVC efficiency when motion compensated frame interpolation (MCFI) techniques are used to generate the side information, a difficult problem in WZVC, especially because the decoder only has some decoded reference frames available. The proposed WZVC compression efficiency rate model relates the power spectral density of the estimation error to the accuracy of the MCFI motion field. Some interesting conclusions may then be derived regarding the impact of motion field smoothness, and of its correlation with the true motion trajectories, on the compression performance.
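
As a concrete illustration of MCFI-based side information creation, here is a hedged sketch of block matching between two decoded reference frames; the block size, search range and the simplification of averaging motion-aligned blocks (rather than splatting at the trajectory midpoint) are assumptions made for clarity.

    import numpy as np

    def mcfi(prev: np.ndarray, nxt: np.ndarray,
             block: int = 8, search: int = 4) -> np.ndarray:
        """Estimate the missing frame from two decoded references."""
        h, w = prev.shape
        side_info = np.zeros((h, w))
        for y in range(0, h - block + 1, block):
            for x in range(0, w - block + 1, block):
                ref = prev[y:y+block, x:x+block].astype(float)
                best_cost, best = np.inf, (0, 0)
                for dy in range(-search, search + 1):
                    for dx in range(-search, search + 1):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy <= h - block and 0 <= xx <= w - block:
                            cand = nxt[yy:yy+block, xx:xx+block].astype(float)
                            cost = np.abs(ref - cand).sum()   # SAD criterion
                            if cost < best_cost:
                                best_cost, best = cost, (dy, dx)
                dy, dx = best
                # Simplification: average the motion-aligned blocks.
                side_info[y:y+block, x:x+block] = 0.5 * (
                    ref + nxt[y+dy:y+dy+block, x+dx:x+dx+block])
        return side_info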

Relevance:

20.00%

Publisher:

Abstract:

One of the most efficient approaches to generate the side information (SI) in distributed video codecs is through motion compensated frame interpolation, where the current frame is estimated based on past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it would be useful to design an architecture where the SI can be more robustly generated at the block level, avoiding the creation of SI frame regions with lower correlation, largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second mode corresponds to a motion compensated quality enhancement (MCQE) technique, where a low-quality Intra block sent by the encoder is used to generate the SI by performing motion estimation with the help of the reference frames. The novel MCQE mode can be advantageous overall from the rate-distortion point of view for blocks where the MCI produces SI with lower correlation, even if some rate has to be invested in the low-quality Intra coded blocks. The overall solution is evaluated in terms of RD performance, with improvements up to 2 dB, especially for high-motion video sequences and long Group of Pictures (GOP) sizes.
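
The block-level mode choice can be pictured with a small sketch like the one below; the confidence proxy (mean absolute difference against a co-located reference block) and the threshold are hypothetical stand-ins, not the criterion the actual framework uses.

    import numpy as np

    def choose_si_mode(mci_block: np.ndarray, ref_block: np.ndarray,
                       threshold: float = 12.0) -> str:
        """Select MCI where interpolation looks reliable, MCQE otherwise."""
        mad = np.abs(mci_block.astype(float)
                     - ref_block.astype(float)).mean()
        # Low difference -> interpolation trusted -> pure MCI mode.
        # High difference -> request a low-quality Intra block (MCQE).
        return "MCI" if mad < threshold else "MCQE"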

Relevance:

20.00%

Publisher:

Abstract:

The building sector is responsible for a significant share of primary energy and electricity consumption in Portugal, mainly associated with the thermal comfort of the occupants. The European Union aims at a 20% reduction, by 2020, in energy consumption and the consequent CO2 emissions, through improvements in the energy efficiency of public and residential buildings. In Portugal, the National Energy Efficiency Action Plan (PNAEE) aims at annual energy savings of at least 1% until 2016, taking as a baseline the average final energy consumption recorded between 2001 and 2005. In this context, buildings predating 1990 (the first version of the RCCTE) may offer significant potential for improving their energy efficiency through rehabilitation. The "Gaioleiros" buildings (1880-1930) form an important part of the historical heritage of the city of Lisbon, and information on their thermal performance is limited; a study was therefore carried out for their experimental and numerical characterization, given the specific thermal behaviour of their very thick walls. This work presents the methodology and the experimental results of measuring the thermal resistance of the walls and the heating energy needs of a dwelling. These experimental results were used to validate a thermal simulation model of the dwelling, which was then used to assess its heating (Nic) and cooling (Nvc) energy needs, to identify improvement opportunities and to evaluate the corresponding rehabilitation potential. As contributions towards sustainable rehabilitation, assessments of improvement opportunities based on thermal insulation reinforcement strategies are presented. The results show that improving the thermal insulation of walls and glazed openings can considerably lower the energy consumption of the dwelling, thus meeting the RCCTE requirements on the thermal quality of the envelope and on energy consumption for new buildings and major rehabilitations.
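
For the wall measurements, a common way to derive thermal resistance from logged heat-flux and temperature data is the so-called average method (as in ISO 9869); whether the study used exactly this procedure is an assumption, but the sketch below conveys the idea.

    def thermal_resistance(delta_T: list, heat_flux: list) -> float:
        """Average method: R [m2.K/W] = sum of temperature differences
        across the wall divided by the sum of measured heat fluxes."""
        return sum(delta_T) / sum(heat_flux)

    # Hypothetical logged data: ~1 K mean difference, ~2 W/m2 mean flux.
    R = thermal_resistance(delta_T=[1.0, 1.1, 0.9],
                           heat_flux=[2.0, 2.2, 1.8])
    print(f"R = {R:.2f} m2.K/W")   # 0.50 m2.K/W for this toy data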

Relevance:

20.00%

Publisher:

Abstract:

The offer of services based on wireless communications has grown exponentially over the last decade. Ever higher transmission rates and better QoS are demanded, without compromising the transmission power or the available bandwidth. MIMO technology can increase the capacity of these systems without requiring additional bandwidth or transmitted power. The work developed in this dissertation consisted of the study of MIMO systems, characterized by the use of multiple antennas to transmit and receive the information. With such a system, a spatial diversity gain can be obtained using space-time codes, which exploit the spatial and time domains simultaneously. Special emphasis is given to Alamouti space-time block coding, namely its reception part, which is implemented on an FPGA. This implementation targets a 2x1 antenna configuration, uses floating point, and supports three modulation types: BPSK, QPSK and 16-QAM. Finally, the trade-off between the accuracy achieved in the numerical representation of the results and the FPGA resources consumed is analysed. With the adopted architecture, throughputs from 29.141 Msymb/s (without pipelines) to 262.674 Msymb/s (with pipelines) are obtained for BPSK modulation.
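
The reception part implemented in FPGA follows the classic Alamouti 2x1 combining equations; below is a hedged NumPy sketch of that combining step (channel values and symbols are illustrative, and the FPGA numeric-representation details are not modeled).

    import numpy as np

    def alamouti_combine(r1: complex, r2: complex,
                         h1: complex, h2: complex):
        """Combine two received slots into estimates of s1 and s2."""
        s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
        s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)
        return s1_hat, s2_hat

    # QPSK symbols sent over a flat channel (h1, h2), noiseless for clarity.
    s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)
    h1, h2 = 0.8 + 0.3j, -0.2 + 0.9j
    r1 = h1 * s1 + h2 * s2                      # slot 1
    r2 = -h1 * np.conj(s2) + h2 * np.conj(s1)   # slot 2 (Alamouti pattern)
    e1, e2 = alamouti_combine(r1, r2, h1, h2)
    gain = abs(h1) ** 2 + abs(h2) ** 2          # diversity gain |h1|^2+|h2|^2
    print(np.allclose([e1 / gain, e2 / gain], [s1, s2]))   # True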

Relevance:

20.00%

Publisher:

Abstract:

This work uses a stacked p-i-n structure based on a hydrogenated amorphous silicon-carbide alloy (a-Si:H and/or a-SiC:H), which acts as an optical filter in the visible range of the electromagnetic spectrum. The aim is to use this device to demultiplex optical signals and to develop an algorithm for the autonomous recognition of the signal transmitted on each channel. The goal of this thesis is to implement an algorithm that autonomously recognizes the information transmitted on each channel by reading the photocurrent supplied by the device. The topic follows from the conclusions of previous works, in which this device and others of identical configuration were analysed in order to explore their use in implementing WDM technology. Three transmission channels were used (blue, 470 nm; green, 525 nm; red, 626 nm), together with several types of background radiation. Measurements of the spectral response and of the temporal response of the device's photocurrent were performed under different experimental conditions. The channel wavelength and the wavelength of the applied background were varied, while the channel intensity and the transmission frequency were kept constant. The results obtained allowed assessing the influence of the background radiation and of the voltage applied to the device, using different data sequences transmitted on the several channels. It was found that, under reverse bias, red background radiation amplifies the photocurrent values of the blue channel, and blue background radiation amplifies the red and green channels. Under forward bias, only blue background radiation amplifies the photocurrent values of the red channel, whereas, for both bias conditions, green background radiation has little influence on the remaining channels. Two algorithms were implemented for recognizing the information of each channel. The first approach used the information contained in the photocurrent measurements generated by the device under reverse and forward bias; by comparing the two measurements, an algorithm was developed and tested that allows the recognition of the individual channels. In the second approach, channel recognition was carried out with the application of background radiation, combining the information contained in the photocurrent measurements generated under reverse bias without background radiation with that contained in the measurements generated under reverse bias with background radiation. By comparing these two measurements, a second algorithm was developed and tested that allows the recognition of the individual channels based on the application of background radiation.
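
The recognition idea (comparing photocurrents acquired under different bias and background conditions) can be illustrated with a toy linear model: if each measurement condition weighs the three channels differently, the per-channel bits can be recovered by solving the resulting system. The responsivity matrix below is entirely hypothetical; the thesis derives its decision rules from measured photocurrents.

    import numpy as np

    # Rows: measurement conditions (reverse bias, reverse bias with blue
    # background, forward bias); columns: (red, green, blue) channel weights.
    responsivity = np.array([[0.9, 0.7, 0.5],
                             [1.2, 0.8, 0.7],
                             [0.3, 0.2, 0.6]])   # hypothetical values

    def recognize(photocurrents: np.ndarray) -> np.ndarray:
        """Recover per-channel on/off bits from three photocurrent reads."""
        bits = np.linalg.solve(responsivity, photocurrents)
        return (bits > 0.5).astype(int)          # threshold to binary

    true_bits = np.array([1, 0, 1])              # red and blue channels ON
    measured = responsivity @ true_bits          # noiseless measurement
    print(recognize(measured))                   # -> [1 0 1]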

Relevance:

20.00%

Publisher:

Abstract:

Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC where side information is made available at the decoder, enables a flexible distribution of the computational complexity between the encoder and decoder, promising to fulfill novel requirements from applications such as video surveillance, sensor networks and mobile camera phones. The quality of the side information at the decoder has a critical role in determining the WZ video coding rate-distortion (RD) performance, notably to raise it to a level as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder only has some decoded reference frames available. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper, along with a new frame interpolation framework able to generate higher quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated in a transform-domain turbo coding based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements up to 2 dB; moreover, it allows outperforming H.264/AVC Intra by up to 3 dB with a lower encoding complexity.
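
Motion-field regularization, the core idea behind the proposed side information creation, can be sketched as neighbourhood smoothing of per-block motion vectors; the median filter below is a common regularizer used here purely as an illustration, the paper's actual scheme being more elaborate.

    import numpy as np

    def regularize_motion_field(mv: np.ndarray) -> np.ndarray:
        """mv: (H, W, 2) per-block motion vectors; returns smoothed field."""
        out = mv.astype(float)
        H, W, _ = mv.shape
        for y in range(H):
            for x in range(W):
                y0, y1 = max(0, y - 1), min(H, y + 2)
                x0, x1 = max(0, x - 1), min(W, x + 2)
                nbhd = mv[y0:y1, x0:x1].reshape(-1, 2)
                # Component-wise median over the 3x3 neighbourhood
                # suppresses outlier vectors before interpolation.
                out[y, x] = np.median(nbhd, axis=0)
        return out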

Relevance:

20.00%

Publisher:

Abstract:

Results on the use of a double a-SiC:H p-i-n heterostructure for signal multiplexing and demultiplexing applications in the visible range are presented. Pulsed monochromatic beams together (multiplexing mode), or a single polychromatic beam (demultiplexing mode), impinge on the device and are absorbed according to their wavelength. Red, green and blue pulsed input channels are transmitted together, each one with a specific transmission rate. The combined optical signal is analyzed by reading out the generated photocurrent under different applied voltages. Results show that in the multiplexing mode the output signal is balanced by the wavelength and transmission rate of each input channel, keeping the memory of the incoming optical carriers. In the demultiplexing mode the photocurrent is controlled by the applied voltage, allowing the transmitted information to be recovered. A physical model supported by a numerical simulation gives insight into the device operation.
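
A toy numerical picture of the multiplexing mode: model the device output as a wavelength-weighted sum of three on-off channels running at different bit rates, so the combined photocurrent keeps the memory of every optical carrier. Weights and rates below are illustrative assumptions, not measured device values.

    import numpy as np

    fs = 12000                                  # samples per second (toy)
    rates = {"red": 1000, "green": 2000, "blue": 4000}   # bits per second
    weights = {"red": 0.5, "green": 0.7, "blue": 1.0}    # relative response

    rng = np.random.default_rng(0)
    channels = {c: rng.integers(0, 2, size=r) for c, r in rates.items()}

    def waveform(bits: np.ndarray, rate: int) -> np.ndarray:
        # On-off keying: hold each bit for fs // rate samples.
        return np.repeat(bits, fs // rate).astype(float)

    # Combined photocurrent: weighted superposition of the three channels.
    photocurrent = sum(weights[c] * waveform(channels[c], rates[c])
                       for c in rates)
    print(photocurrent[:8])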

Relevance:

20.00%

Publisher:

Abstract:

Preliminary version

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to establish some basic guidelines to help draft the information letter that would be sent to individual contributors should the Spanish public pension system decide to adopt this model. With this end in mind, and basing our work on the experiences of the most advanced countries in the field and on the pioneering papers by Jackson (2005), Larsson et al. (2008) and Sunden (2009), we look into the concept of "individual pension information" and identify its most relevant characteristics. We then give a detailed description of two models, those of the United States and Sweden, and in particular look at how they are structured, what aspects could be improved and what their limitations are. Finally, we make some recommendations of special interest for designing the Spanish model.

Relevance:

20.00%

Publisher:

Abstract:

The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms, such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each one correlated with the source according to a different correlation noise channel. Thus, it is proposed to exploit the multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to obtain coding efficiency improvements. At the decoder side, the multiple SI hypotheses are created with motion compensated frame interpolation and fused together in a novel iterative LDPC-based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bitrate savings up to 8.0% are obtained for similar decoded quality.
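
One simple way to picture the fusion of multiple SI hypotheses is as the combination of per-hypothesis log-likelihood ratios (LLRs) feeding the Slepian-Wolf decoder; the additive fusion below assumes conditionally independent Gaussian correlation noise per hypothesis, which is a simplification of the paper's iterative LDPC-based fusion.

    # LLR of a source bit given one SI observation under a Gaussian
    # correlation-noise channel, with bit 0 mapped to +1 and bit 1 to -1.
    def bit_llr(si_value: float, sigma: float) -> float:
        return 2.0 * si_value / sigma**2

    def fuse_hypotheses(si_values: list, sigmas: list) -> float:
        # Sum the per-hypothesis LLRs (independence assumption).
        return sum(bit_llr(v, s) for v, s in zip(si_values, sigmas))

    # Two SI hypotheses observing the same bit (+1 sent), different noise.
    fused = fuse_hypotheses(si_values=[0.8, 1.1], sigmas=[0.7, 1.0])
    print("decoded bit:", 0 if fused > 0 else 1)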

Relevance:

20.00%

Publisher:

Abstract:

Multilevel power converters have been introduced as the solution for high-power, high-voltage switching applications, where they have well-known advantages. Recently, full back-to-back connected multilevel neutral point diode clamped converters (NPC converters) have been used in high-voltage direct current (HVDC) transmission systems. Bipolar-connected back-to-back NPC converters have advantages over the full back-to-back connection in long-distance HVDC transmission systems, but greater difficulty in balancing the dc capacitor voltage divider on both the sending and receiving end NPC converters. This study shows that power flow control and dc capacitor voltage balancing are feasible using fast optimum-predictive-based controllers in HVDC systems using bipolar back-to-back-connected five-level NPC multilevel converters. For both converter sides, the control strategy takes into account active and reactive power, establishes the ac grid currents at both ends, and guarantees the balancing of the dc bus capacitor voltages in both NPC converters. Additionally, the semiconductor switching frequency is minimised to reduce switching losses. The performance and robustness of the new fast predictive control strategy, and its capability to solve the dc capacitor voltage balancing problem of bipolar-connected back-to-back NPC converters, are evaluated.
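
The flavour of the fast optimum-predictive controller can be conveyed with a finite-control-set sketch: enumerate candidate voltage levels, predict one step ahead, and pick the level minimizing a cost that weighs current tracking against capacitor unbalance. The scalar models, parameters and weights below are illustrative assumptions, not the paper's converter model.

    def predict(u: float, i_now: float, vc1: float, vc2: float,
                ts: float = 1e-4, L: float = 5e-3, C: float = 2e-3):
        """One-step Euler prediction of line current and capacitor voltages."""
        i_next = i_now + ts / L * u                  # simplified RL dynamics
        dv = ts / C * i_now * (1 if u > 0 else -1)   # toy unbalance model
        return i_next, vc1 + dv, vc2 - dv

    def best_switching_state(i_ref: float, i_now: float,
                             vc1: float, vc2: float,
                             levels=(-1.0, -0.5, 0.0, 0.5, 1.0),
                             w_balance: float = 0.1) -> float:
        best_u, best_cost = None, float("inf")
        for u in levels:                             # five-level candidates
            i_p, v1_p, v2_p = predict(u, i_now, vc1, vc2)
            # Cost: current tracking error plus capacitor voltage unbalance.
            cost = (i_ref - i_p) ** 2 + w_balance * (v1_p - v2_p) ** 2
            if cost < best_cost:
                best_u, best_cost = u, cost
        return best_u

    print(best_switching_state(i_ref=1.0, i_now=0.2, vc1=1.02, vc2=0.98))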