20 results for One-shot information theory
Abstract:
This paper argues that the thought of the North American critical theorist James W. Carey provides a relevant perspective on communication and technology. Against the background of American social pragmatism and the progressive thinkers of the early 20th century (such as Dewey, Mead, Cooley, and Park), Carey built a perspective that brought together the political economy of Harold A. Innis and the social criticism of David Riesman and C. Wright Mills, and incorporated Marxist topics such as commodification and sociocultural domination. The main goal of this paper is to explore the connection Carey established between modern technological communication and what he called the “transmissive model”, a model which not only reduces the symbolic process of communication to instrumentalization and information delivery, but also converges politically with capitalism and with goals of power, control and expansion. Conceiving communication as a process that creates symbolic and cultural systems, in which and through which social life takes place, Carey gives equal emphasis to the processes by which communication is embodied. If symbolic forms and culture condition action, they are in turn influenced by the technological and economic materializations of symbolic systems, and by other conditioning structures. In Carey’s view, communication is never a disembodied force; rather, it is a set of practices in which conceptions, techniques and social relations coexist. These practices configure reality or, alternatively, can refute, transform and celebrate it. Exhibiting a sensibility attuned to the historical understanding of communication, media and information technologies, one of the issues Carey explored most was the history of the telegraph as a harbinger of the Internet and of its problems and contradictions. Carey saw the Internet as the contemporary heir of the communications revolution triggered by the prototype of transmission technologies: the telegraph of the 19th century. In the telegraph Carey saw the prototype of many subsequent commercial empires based on science and technology; a pioneering model of complex business management; an example of conflicts of interest over the control of patents; an inducer of changes in both language and structures of knowledge; and a promoter of futurist and utopian thinking about information technologies. After a brief overview of Carey’s communication theory, this paper focuses on his seminal essay "Technology and Ideology: The Case of the Telegraph", bearing in mind the communication revolution introduced by the Internet. We maintain that this essay is of lasting relevance for a critical study of the information society. Our reading of it highlights the reach, as well as the problems, of an approach that conceives the innovation of the telegraph as a metaphor for all innovations, announcing the modern stage of history and determining, to this day, the major lines of development of modern communication systems.
Abstract:
One of the most efficient approaches to generate the side information (SI) in distributed video codecs is motion compensated frame interpolation, where the current frame is estimated from past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it would be useful to design an architecture where the SI can be generated more robustly at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second corresponds to a motion compensated quality enhancement (MCQE) technique in which a low-quality Intra block sent by the encoder is used to generate the SI through motion estimation over the reference frames. For blocks where MCI produces SI with lower correlation, the novel MCQE mode can be advantageous overall from the rate-distortion (RD) point of view, even though some rate has to be invested in the low-quality Intra coded blocks. The overall solution is evaluated in terms of RD performance, with improvements of up to 2 dB, especially for high-motion video sequences and long Group of Pictures (GOP) sizes.
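A minimal sketch of the block-level two-mode SI generation described above, in Python with NumPy. The block size, threshold, reliability proxy and the zero-motion stand-ins for MCI and MCQE are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

BLOCK = 8  # block size (illustrative)

def mci_block(past, future):
    """Stand-in for MCI: average of co-located past/future blocks.
    A real MCI module would perform bidirectional motion estimation."""
    return (past.astype(np.float64) + future) / 2.0

def reliability(past, future):
    """Proxy for the expected SI correlation: where past and future
    references agree, interpolation is likely reliable."""
    return -np.mean(np.abs(past.astype(np.float64) - future))

def side_information(past, future, intra_lq, threshold=-15.0):
    """Choose MCI or MCQE per block (hypothetical decision rule)."""
    h, w = past.shape
    si = np.empty((h, w))
    for y in range(0, h, BLOCK):
        for x in range(0, w, BLOCK):
            sl = (slice(y, y + BLOCK), slice(x, x + BLOCK))
            if reliability(past[sl], future[sl]) >= threshold:
                si[sl] = mci_block(past[sl], future[sl])  # MCI mode
            else:
                # MCQE mode: start from the low-quality Intra block sent by
                # the encoder; a real implementation would refine it by
                # motion estimation against the reference frames.
                si[sl] = intra_lq[sl]
    return si

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    past = rng.integers(0, 256, (32, 32))
    future = past + rng.integers(-5, 6, (32, 32))      # low-motion scene
    intra_lq = past + rng.integers(-10, 11, (32, 32))  # coarse Intra frame
    print(side_information(past, future, intra_lq).shape)
```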
Abstract:
We discuss existence and multiplicity of positive solutions of the Dirichlet problem for the quasilinear ordinary differential equation $-\left(u'/\sqrt{1-u'^{2}}\right)' = f(t,u)$. Depending on the behaviour of f = f(t, s) near s = 0, we prove the existence of either one, or two, or three, or infinitely many positive solutions. In general, the positivity of f is not required. All results are obtained by reduction to an equivalent non-singular problem to which variational or topological methods apply in a classical fashion.
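For concreteness, the boundary-value problem in display form; the interval $(0,T)$ and the homogeneous Dirichlet conditions are an assumption, since the abstract does not state the domain:

```latex
\[
\begin{cases}
-\left( \dfrac{u'}{\sqrt{1-u'^{2}}} \right)' = f(t,u), & t \in (0,T), \\[2mm]
u(0) = u(T) = 0 .
\end{cases}
\]
```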
Abstract:
The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms, such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each one correlated with the source according to a different correlation noise channel. It is thus proposed to exploit the multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to obtain coding efficiency improvements. At the decoder side, the multiple SI hypotheses are created with motion compensated frame interpolation and fused together in a novel iterative LDPC-based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bitrate savings of up to 8.0% are obtained for similar decoded quality.
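A minimal Python sketch of how two SI hypotheses with different correlation noise channels can be mapped to per-bit log-likelihood ratios (LLRs) and fused. Modelling each channel as a binary symmetric channel and summing LLRs under a conditional-independence assumption is an illustration only, not the paper's message-exchange scheme between syndrome decoders:

```python
import numpy as np

def si_llr(si_bits, p):
    """Per-bit LLR of the source given one SI hypothesis, modelling the
    correlation noise as a binary symmetric channel with crossover
    probability p: LLR = log P(x=0 | y) / P(x=1 | y)."""
    mag = np.log((1.0 - p) / p)
    return np.where(si_bits == 0, mag, -mag)

def fused_llr(hypotheses):
    """Naive fusion: treat the hypotheses as conditionally independent
    observations of the source and sum their LLRs (an assumption)."""
    return sum(si_llr(bits, p) for bits, p in hypotheses)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.integers(0, 2, 16)                    # source bitplane
    y1 = x ^ (rng.random(16) < 0.10).astype(int)  # SI hypothesis 1
    y2 = x ^ (rng.random(16) < 0.25).astype(int)  # SI hypothesis 2
    llr = fused_llr([(y1, 0.10), (y2, 0.25)])
    x_hat = (llr < 0).astype(int)                 # hard decision; a BP
    print("bit errors:", int(np.sum(x_hat != x))) # decoder would use llr
```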
Abstract:
Low-density parity-check (LDPC) codes are nowadays one of the hottest topics in coding theory, notably due to their advantages in terms of bit error rate performance and low complexity. In order to exploit the potential of the Wyner-Ziv coding paradigm, practical distributed video coding (DVC) schemes should use powerful error correcting codes with near-capacity performance. In this paper, new ways to design LDPC codes for the DVC paradigm are proposed and studied. The new LDPC solutions rely on merging parity-check nodes, which corresponds to reducing the number of rows in the parity-check matrix. This makes it possible to change the compression ratio of the source (DCT coefficient bitplane) gracefully, according to the correlation between the original and the side information. The proposed LDPC codes achieve good performance over a wide range of source correlations and a better RD performance when compared to the popular turbo codes.
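A minimal Python sketch of the row-merging idea: merging two parity-check rows by GF(2) addition also merges the corresponding syndrome bits, so fewer syndrome bits are sent and the compression ratio increases. The merging schedule below is arbitrary, not the one proposed in the paper:

```python
import numpy as np

def merge_checks(H, syndrome, pairs):
    """Merge pairs of parity-check rows (and syndrome bits) by XOR,
    reducing the number of rows of H and hence the syndrome rate."""
    H, syndrome = H.copy(), syndrome.copy()
    keep = np.ones(H.shape[0], dtype=bool)
    for i, j in pairs:
        H[i] ^= H[j]
        syndrome[i] ^= syndrome[j]
        keep[j] = False
    return H[keep], syndrome[keep]

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    H = rng.integers(0, 2, (6, 12))   # toy parity-check matrix
    x = rng.integers(0, 2, 12)        # source bitplane
    s = (H @ x) % 2                   # syndrome sent to the decoder
    H2, s2 = merge_checks(H, s, [(0, 1), (2, 3)])
    print(H2.shape, s2)               # 4 checks left: higher compression
```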
Abstract:
The development of children's school achievement in mathematics is one of the most important aims of education in Poland. The results of research monitoring school achievement in maths are not optimistic: we observe low levels of children's understanding of the essence of maths, of self-developed problem-solving strategies, and of the practical use of maths skills. This article frames the discussion of this problem in its psychological and didactic context and analyses the causes as they relate to school practice in teaching maths.
Abstract:
Master's in Accounting
Abstract:
Master's in Management Control and Business
Abstract:
We present a study of the effects of nanoconfinement on a system of hard Gaussian overlap particles interacting with planar substrates through the hard-needle-wall potential, extending earlier work by two of us [D. J. Cleaver and P. I. C. Teixeira, Chem. Phys. Lett. 338, 1 (2001)]. Here, we consider the case of hybrid films, where one of the substrates induces strongly homeotropic anchoring, while the other favors either weakly homeotropic or planar anchoring. These systems are investigated using both Monte Carlo simulation and density-functional theory, the latter implemented at the level of Onsager's second-virial approximation with Parsons-Lee rescaling. The orientational structure is found to change either continuously or discontinuously depending on substrate separation, in agreement with earlier predictions by others. The theory is seen to perform well in spite of its simplicity, predicting the positional and orientational structure seen in simulations even for small particle elongations.
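For reference, the Parsons-Lee rescaling mentioned above multiplies Onsager's second-virial excess free energy by a density-dependent factor obtained from the Carnahan-Starling equation of state,

```latex
\[
F_{\mathrm{exc}} \approx \frac{4 - 3\eta}{4\,(1-\eta)^{2}}\; F_{\mathrm{Onsager}},
\]
```

where $\eta$ is the packing fraction; this is the textbook form of the rescaling, quoted as background rather than taken from the paper.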
Abstract:
We investigate the influence of strong directional, or bonding, interactions on the phase diagram of complex fluids, and in particular on the liquid-vapour critical point. To this end we revisit a simple model and theory for associating fluids consisting of spherical particles with a hard-core repulsion, complemented by three short-ranged attractive sites on the surface (sticky spots). Two of the spots are of type A and one is of type B; the interactions between each pair of spots have strengths ε_AA, ε_AB and ε_BB. The theory is applied over the whole range of bonding strengths and the results are interpreted in terms of the equilibrium cluster structures of the coexisting phases. In systems where unlike sites do not interact (i.e. where ε_AB = 0), the critical point exists all the way down to vanishing ε_BB. By contrast, when ε_BB = 0, there is no critical point below a certain finite value of ε_AB. These somewhat surprising results are rationalised in terms of the different network structures of the two systems: two long AA chains are linked by one BB bond (X-junction) in the former case, and by one AB bond (Y-junction) in the latter. The vapour-liquid transition may then be viewed as the condensation of these junctions, and we find that X-junctions condense for any attractive ε_BB (i.e. for any fraction of BB bonds), whereas condensation of the Y-junctions requires that ε_AB be above a finite threshold (i.e. there must be a finite fraction of AB bonds).
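The standard theory for such sticky-spot models is Wertheim's first-order thermodynamic perturbation theory; assuming this is the theory employed (the abstract does not name it), the bonding free energy for a particle carrying two A sites and one B site takes the textbook form

```latex
\[
\frac{F_{\mathrm{bond}}}{N k_{B} T}
  = 2\left(\ln X_{A} - \frac{X_{A}}{2}\right)
  + \ln X_{B} - \frac{X_{B}}{2} + \frac{3}{2},
\]
```

where $X_A$ and $X_B$ are the fractions of A and B sites not bonded, fixed by a law of mass action involving ε_AA, ε_AB and ε_BB.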
Abstract:
Final Internship Report presented to the Escola Superior de Dança to obtain the degree of Master in Dance Teaching.
Abstract:
Master's in Radiotherapy
Abstract:
The dynamics of catalytic networks have been widely studied over the last decades because of their implications in several fields, such as prebiotic evolution, virology, neural networks, immunology and ecology. One of the most studied mathematical frameworks for catalytic networks was initially formulated in the context of prebiotic evolution, by means of the hypercycle theory. The hypercycle is a set of self-replicating species able to catalyze other replicator species within a cyclic architecture. Hypercyclic organization might arise from a quasispecies as a way to increase the information content beyond the so-called error threshold. The catalytic coupling between replicators makes all the species behave like a single, coherent evolutionary multimolecular unit. The inherent nonlinearities of catalytic interactions are responsible for the emergence of several types of dynamics, among them chaos. In this article we begin with a brief review of hypercycle theory, focusing on its evolutionary implications as well as on the different dynamics associated with different types of small catalytic networks. We then study the properties of chaotic hypercycles with error-prone replication using symbolic dynamics, characterizing, by means of the theory of topological Markov chains, the topological entropy and the orbit periods of unimodal-like iterated maps obtained from the strange attractor. We focus our study on some key parameters responsible for the structure of the catalytic network: mutation rates, and autocatalytic and cross-catalytic interactions.
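A minimal sketch of the entropy computation alluded to above: for a topological Markov chain with 0-1 transition matrix A, the topological entropy is the natural logarithm of the spectral radius of A. The matrix below (the golden-mean shift) is purely illustrative, not one derived from the hypercycle attractor:

```python
import numpy as np

def topological_entropy(A):
    """Topological entropy of a topological Markov chain: log of the
    spectral radius of its 0-1 transition matrix A."""
    return float(np.log(np.max(np.abs(np.linalg.eigvals(A)))))

if __name__ == "__main__":
    # Golden-mean shift: symbol 1 may not follow itself.
    A = np.array([[1.0, 1.0],
                  [1.0, 0.0]])
    print(topological_entropy(A))  # log((1 + sqrt(5)) / 2) ≈ 0.4812
```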
Abstract:
PhD in Financial Economics and Accounting
Abstract:
Thesis to obtain the Master of Science Degree in Computer Science and Engineering