43 results for One-shot information theory


Relevance: 100.00%

Abstract:

We present an analytical description of a specific nonlinear fibre channel with non-compensated dispersion. Information-theoretic analysis shows that the capacity of such a nonlinear fibre channel does not decay with growing signal-to-noise ratio. © 2012 IEEE.

Relevance: 100.00%

Abstract:

A closed-form expression for a lower bound on the per-soliton capacity of the nonlinear optical fibre channel in the presence of (optical) amplifier spontaneous emission (ASE) noise is derived. This bound is based on a non-Gaussian conditional probability density function for the soliton amplitude jitter induced by the ASE noise and is proven to grow logarithmically as the signal-to-noise ratio increases.

Relevance: 100.00%

Abstract:

Erasure control coding has been exploited in communication networks with the aim of improving the end-to-end performance of data delivery across the network. To address concerns over the strengths and constraints of erasure coding schemes in this application, we examine the performance limits of two erasure control coding strategies, forward erasure recovery and adaptive erasure recovery. Our investigation shows that the throughput of a network using an (n, k) forward erasure control code is capped by r = k/n when the packet loss rate p ≤ te/n, and by k(1-p)/(n-te) when p > te/n, where te is the erasure control capability of the code. It also shows that the lower bound of the residual loss rate of such a network is (np-te)/(n-te) for te/n < p ≤ 1. In particular, if the code used is maximum distance separable, the Shannon capacity of the erasure channel, i.e. 1-p, can be achieved, and the residual loss rate is lower bounded by (p+r-1)/r for 1-r < p ≤ 1. To address the requirements of real-time applications, we also investigate the service completion time of different schemes. It is revealed that the latency of the forward erasure recovery scheme is fractionally higher than that of the scheme without erasure control coding or retransmission mechanisms (using UDP), but much lower than that of the adaptive erasure scheme when the packet loss rate is high. Comparisons between the two erasure control schemes exhibit their advantages as well as disadvantages in delivering end-to-end services. To show the impact of the derived bounds on the end-to-end performance of a TCP/IP network, a case study is provided to demonstrate how erasure control coding could be used to maximize the performance of practical systems. © 2010 IEEE.
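The throughput and residual-loss bounds quoted above can be evaluated directly. The following Python sketch simply computes them for an illustrative (n, k) code, where te denotes the erasure control capability; the numeric parameters are made up for the example and are not from the paper.

```python
# Sketch of the throughput and residual-loss bounds quoted in the abstract for an
# (n, k) forward erasure control code; te is the code's erasure control capability.
# The numeric parameters below are made up purely for illustration.

def throughput_bound(n: int, k: int, te: int, p: float) -> float:
    """Upper bound on normalized throughput at packet loss rate p."""
    r = k / n
    if p <= te / n:
        return r                       # capped by the code rate
    return k * (1 - p) / (n - te)      # loss exceeds the code's correction capability

def residual_loss_bound(n: int, te: int, p: float) -> float:
    """Lower bound on the residual loss rate, valid for te/n < p <= 1."""
    return (n * p - te) / (n - te)

def residual_loss_bound_mds(r: float, p: float) -> float:
    """Lower bound for a maximum distance separable code, valid for 1 - r < p <= 1."""
    return (p + r - 1) / r

if __name__ == "__main__":
    n, k, te = 255, 223, 32            # hypothetical RS-like code parameters
    for p in (0.05, 0.20, 0.40):
        cap = throughput_bound(n, k, te, p)
        loss = residual_loss_bound(n, te, p) if p > te / n else 0.0
        print(f"p={p:.2f}  throughput <= {cap:.3f}  residual loss >= {loss:.3f}")
```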

Relevance: 100.00%

Abstract:

Clusters of temporal optical solitons (stable self-localized light pulses that preserve their form during propagation) exhibit properties characteristic of those encountered in crystals. Here, we introduce the concept of temporal solitonic information crystals formed by lattices of optical pulses with variable phases. The proposed general idea offers new approaches to optical coherent transmission technology and can be generalized to dispersion-managed and dissipative solitons, as well as scaled to a variety of physical platforms from fiber optics to silicon chips. We discuss the key properties of such dynamic temporal crystals, which mathematically correspond to non-Hermitian lattices, and examine the types of collective mode instabilities that determine the lifetime of the soliton train. This transfer of techniques and concepts from solid-state physics to information theory promises a new outlook on information storage and transmission.
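As a loose illustration of the non-Hermitian lattice picture (a generic sketch, not the paper's specific model), the example below builds a small ring lattice of pulses with hypothetical asymmetric couplings g_plus and g_minus and checks whether any collective mode has a positive growth rate, which is what would limit the lifetime of the pulse train.

```python
# Generic illustration only (not the paper's model): collective-mode stability of a
# small non-Hermitian ring lattice of pulses. The asymmetric couplings g_plus and
# g_minus are hypothetical; any eigenvalue with a positive real part corresponds to
# a growing collective mode that limits the lifetime of the pulse train.
import numpy as np

N = 16                                          # number of pulses in the train
g_plus = 0.10 + 0.02j                           # coupling to the next pulse (assumed)
g_minus = 0.08 - 0.01j                          # coupling to the previous pulse (assumed)

M = np.zeros((N, N), dtype=complex)
for i in range(N):
    M[i, (i + 1) % N] = g_plus
    M[i, (i - 1) % N] = g_minus

growth_rates = np.linalg.eigvals(M).real        # growth rate of each collective mode
print(f"largest growth rate: {growth_rates.max():.4f}")
print("unstable" if growth_rates.max() > 1e-12 else "stable or marginally stable")
```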

Relevance: 40.00%

Abstract:

A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
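As a concrete illustration of the simplest approximation the blurb mentions, the sketch below runs the naive mean-field fixed-point iteration m_i = tanh(beta*(sum_j J_ij m_j + h_i)) for a small Ising-type model; the couplings J, fields h and inverse temperature beta are made-up illustrative values, not taken from the book.

```python
# Minimal sketch of the naive mean-field method mentioned in the abstract: the
# fixed-point iteration m_i = tanh(beta * (sum_j J_ij * m_j + h_i)) for a small
# Ising-type model. Couplings J, fields h and beta are made-up illustrative values.
import numpy as np

rng = np.random.default_rng(0)
n = 8
J = rng.normal(scale=0.3, size=(n, n))
J = (J + J.T) / 2                      # symmetric couplings
np.fill_diagonal(J, 0.0)               # no self-coupling
h = rng.normal(scale=0.1, size=n)      # external fields
beta = 1.0                             # inverse temperature

m = np.zeros(n)                        # mean-field magnetizations
for _ in range(200):
    m_new = np.tanh(beta * (J @ m + h))
    if np.max(np.abs(m_new - m)) < 1e-10:
        m = m_new
        break
    m = m_new

print("mean-field magnetizations:", np.round(m, 3))
```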

Relevance: 40.00%

Abstract:

It has been suggested that, in order to maintain its relevance, critical research must develop a strong emphasis on empirical work rather than the conceptual emphasis that has typically characterized critical scholarship in management. A critical project of this nature is applicable in the information systems (IS) arena, which has a growing tradition of qualitative inquiry. Despite its relativist ontology, actor–network theory places a strong emphasis on empirical inquiry and this paper argues that actor–network theory, with its careful tracing and recording of heterogeneous networks, is well suited to the generation of detailed and contextual empirical knowledge about IS. The intention in this paper is to explore the relevance of IS research informed by actor–network theory in the pursuit of a broader critical research project as defined in earlier work.

Relevance: 40.00%

Abstract:

This paper outlines a research methodology informed by theorists who have contributed to actor network theory (ANT). Research informed by such a perspective recognizes the constitutive role of accounting systems in the achievement of broader social goals. Latour, Knorr Cetina and others argue that the bringing in of non-human actants, through the growth of technology and science, has added immeasurably to the complexity of modern society. The paper ‘sees’ accounting and accounting systems as being constituted by technological ‘black boxes’ and seeks to discuss two questions. One concerns the processes which surround the establishment of ‘facts’, i.e. how ‘black boxes’ are created or accepted (even if temporarily) within society. The second concerns the role of existing ‘black boxes’ within society and organizations. Accounting systems not only promote a particular view of the activities of an organization or a subunit, but in their very implementation and operation ‘mobilize’ other organizational members in a particular direction. The implications of such an interpretation are explored in this paper, firstly through a discussion of some of the theoretical constructs that have been proposed to frame ANT research, and secondly by relating some of these ideas to aspects of the empirics in a qualitative case study. The case site is in the health sector and involves the implementation of a casemix accounting system. Evidence from the case research is used to exemplify aspects of the theoretical constructs.

Relevance: 40.00%

Abstract:

Over recent years, evidence has been accumulating in favour of the importance of long-term information as a variable which can affect the success of short-term recall. Lexicality, word frequency, imagery and meaning have all been shown to augment short-term recall performance. Two competing theories as to the causes of this long-term memory influence are outlined and tested in this thesis. The first approach is the order-encoding account, which ascribes the effect to the usage of resources at encoding, hypothesising that word lists which require less effort to process will benefit from increased levels of order encoding, in turn enhancing recall success. The alternative view, trace redintegration theory, suggests that order is automatically encoded phonologically, and that long-term information can only influence the interpretation of the resultant memory trace. The free recall experiments reported here attempted to determine the importance of order encoding as a facilitatory framework and to determine the locus of the effects of long-term information in free recall. Experiments 1 and 2 examined the effects of word frequency and semantic categorisation over a filled delay, and Experiments 3 and 4 did the same for immediate recall. Free recall was improved by both long-term factors tested. Order information was not used over a short filled delay, but was evident in immediate recall. Furthermore, it was found that both long-term factors increased the amount of order information retained. Experiment 5 induced an order-encoding effect over a filled delay, leaving a picture of short-term processes which are closely associated with long-term processes, and which fit conceptions of short-term memory as part of language processing better than either the encoding-based or the retrieval-based models. Experiments 6 and 7 aimed to determine to what extent phonological processes were responsible for the pattern of results observed. Articulatory suppression affected the encoding of order information where speech rate had no direct influence, suggesting that it is ease of lexical access which is the most important factor in the influence of long-term memory on immediate recall tasks. The evidence presented in this thesis does not offer complete support for either the retrieval-based account or the order-encoding account of long-term influence. Instead, the evidence sits best with models that are based upon language processing. The path urged for future research is to find ways in which this diffuse model can be better specified, and which can take account of the versatility of the human brain.

Relevance: 40.00%

Abstract:

DUE TO COPYRIGHT RESTRICTIONS, ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT.

This study is about leadership in American Evangelical Churches, which, as a subset of American Christianity, are growing while American Christianity as a whole is in decline. As a result, evangelicalism is quickly becoming the dominant iteration of American Christianity. It is anecdotal that well-led churches grow while poorly led churches do not, yet no one has identified what leadership, in the evangelical church context, is. Researchers have investigated a number of aspects of church leadership (much of it without identifying whether the churches under investigation were evangelical), but no one has put forth a unified theory linking these aspects together. The purpose of this research is to address that gap and develop a theory that explains how evangelicals view leadership in their local churches. In this study of three churches, dissimilar in size and governance, a purely qualitative approach to data collection and analysis was employed. The study involved 60 interviews that sought points of view from top and mid-level leadership along with congregant followers. The study borrowed heavily from Glaser and Strauss's (1967) grounded theory approach to data analysis. The result is a theory which provides a unified explanation of how leadership actually works in the three evangelical churches. Several implications for practice are discussed regarding the theory's usefulness as a method of leadership education and evaluation. An original finding was that an individual's incumbency within the organization acts as a form of social power. Limitations to this research are those generally imputed to purely qualitative research, in that questions are raised about the theory's applicability to evangelical churches beyond the three studied. Suggestions for further research involve addressing those limitations.

Relevance: 40.00%

Abstract:

This paper presents the design and results of a task-based user study, based on Information Foraging Theory, on a novel user interaction framework, uInteract, for content-based image retrieval (CBIR). The framework includes a four-factor user interaction model and an interactive interface. The user study involves three focused evaluations, 12 simulated real-life search tasks with different complexity levels, 12 comparative systems and 50 subjects. Information Foraging Theory is applied to the user study design and the quantitative data analysis. The systematic findings have not only shown how effective and easy to use the uInteract framework is, but have also illustrated the value of Information Foraging Theory for interpreting user interaction with CBIR. © 2011 Springer-Verlag Berlin Heidelberg.

Relevance: 40.00%

Abstract:

The paper proposes an ISE (Information goal, Search strategy, Evaluation threshold) user classification model based on Information Foraging Theory for understanding user interaction with content-based image retrieval (CBIR). The proposed model is verified by a multiple linear regression analysis based on 50 users' interaction features collected from a task-based user study of interactive CBIR systems. To the best of our knowledge, this is the first principled user classification model in CBIR verified by a formal and systematic quantitative analysis of extensive user interaction data. Copyright 2010 ACM.
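For illustration only, the sketch below fits an ordinary least-squares model of the kind used in the verification step; the interaction features and response are hypothetical stand-ins, not the study's actual 50-user dataset.

```python
# Illustration only: an ordinary least-squares fit of the kind the abstract uses to
# verify the ISE model. The interaction features and response below are hypothetical
# stand-ins, not the study's actual 50-user dataset.
import numpy as np

rng = np.random.default_rng(1)
n_users = 50
# hypothetical interaction features: queries issued, images viewed, time on task
features = rng.normal(size=(n_users, 3))
# hypothetical response, e.g. a score related to the user's evaluation threshold
y = features @ np.array([0.8, -0.3, 0.5]) + rng.normal(scale=0.2, size=n_users)

X = np.column_stack([np.ones(n_users), features])   # design matrix with intercept
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept and coefficients:", np.round(coef, 3))
```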

Relevance: 40.00%

Abstract:

We present a review of the latest developments in one-dimensional (1D) optical wave turbulence (OWT). Based on an original experimental setup that allows for the implementation of 1D OWT, we are able to show that an inverse cascade occurs through the spontaneous evolution of the nonlinear field up to the point when modulational instability leads to soliton formation. After solitons are formed, further interaction of the solitons among themselves and with incoherent waves leads to a final condensate state dominated by a single strong soliton. Motivated by the observations, we develop a theoretical description, showing that the inverse cascade develops through six-wave interaction, and that this is the basic mechanism of nonlinear wave coupling for 1D OWT. We describe theory, numerics and experimental observations while trying to incorporate all the different aspects into a consistent context. The experimental system is described by two coupled nonlinear equations, which we explore within two wave limits, each allowing the evolution of the complex amplitude to be expressed by a single dynamical equation. The long-wave limit corresponds to waves with wave numbers smaller than the inverse electrical coherence length of the liquid crystal, and the opposite limit to wave numbers that are larger. We show that both of these systems are of a dual cascade type, analogous to two-dimensional (2D) turbulence, which can be described by wave turbulence (WT) theory, and conclude that the cascades are induced by a six-wave resonant interaction process. WT theory predicts several stationary solutions (non-equilibrium and thermodynamic) to both the long- and short-wave systems, and we investigate the necessary conditions required for their realization. Interestingly, the long-wave system is close to the integrable 1D nonlinear Schrödinger equation (NLSE) (which contains exact nonlinear soliton solutions), and as a result, during the inverse cascade, the nonlinearity of the system at low wave numbers becomes strong. Subsequently, due to the focusing nature of the nonlinearity, this leads to modulational instability (MI) of the condensate and the formation of solitons. Finally, with the aid of the probability density function (PDF) description of WT theory, we explain the coexistence and mutual interactions between solitons and the weakly nonlinear random wave background in the form of a wave turbulence life cycle (WTLC).
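The long-wave limit mentioned above is close to the integrable 1D focusing NLSE, and the modulational instability of a condensate can be reproduced with a minimal split-step Fourier sketch such as the one below; the normalization, grid and parameters are illustrative assumptions, not the paper's experimental values.

```python
# Minimal split-step Fourier sketch of the 1D focusing NLSE,
#     i*psi_t + 0.5*psi_xx + |psi|^2 * psi = 0,
# which the long-wave limit discussed above is close to. A weakly perturbed
# plane-wave condensate undergoes modulational instability and breaks up into
# soliton-like pulses. Grid and parameters are illustrative assumptions only.
import numpy as np

L, N, dt, steps = 100.0, 1024, 0.01, 4000
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)

psi = (1.0 + 0.01 * np.cos(2 * np.pi * 5 * x / L)).astype(complex)  # condensate + perturbation
lin = np.exp(-0.5j * k**2 * dt)                                     # exact linear (dispersive) step

for _ in range(steps):
    psi *= np.exp(1j * np.abs(psi)**2 * dt)        # nonlinear step
    psi = np.fft.ifft(np.fft.fft(psi) * lin)       # linear step in Fourier space

# MI growth: the peak intensity rises well above the initial ~1.02 of the condensate
print("final peak |psi|^2 =", round(float(np.abs(psi).max()**2), 2))
```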

Relevance: 40.00%

Abstract:

Concept evaluation at the early phase of product development plays a crucial role in new product development, as it determines the direction of the subsequent design activities. However, the evaluation information at this stage comes mainly from experts' judgments, which are subjective and imprecise. How to manage this subjectivity so as to reduce evaluation bias is a major challenge in design concept evaluation. This paper proposes a comprehensive evaluation method which combines information entropy theory with rough numbers. Rough numbers are first used to aggregate individual judgments and priorities and to manipulate the vagueness under a group decision-making environment. A rough-number-based information entropy method is then proposed to determine the relative weights of the evaluation criteria. The composite performance values based on rough numbers are then calculated to rank the candidate design concepts. The results of a practical case study on the concept evaluation of an industrial robot design show that the integrated evaluation model can effectively strengthen the objectivity of the decision-making process.
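As a hedged sketch of the conventional information-entropy weighting step that the proposed method builds on (the paper itself operates on rough-number intervals rather than the crisp scores used here), the example below derives criterion weights from information entropy and ranks candidate concepts by a simple weighted composite score; all scores are made-up illustrative values.

```python
# Hedged sketch of the conventional information-entropy weighting step that the
# proposed method builds on; the paper itself operates on rough-number intervals,
# whereas the crisp scores below are made-up illustrative values.
import numpy as np

# rows = candidate design concepts, columns = evaluation criteria
X = np.array([[7.0, 6.0, 8.0],
              [5.0, 8.0, 6.0],
              [6.0, 7.0, 7.0]])
m, n = X.shape

P = X / X.sum(axis=0)                             # column-normalized proportions
E = -(P * np.log(P)).sum(axis=0) / np.log(m)      # entropy of each criterion
w = (1 - E) / (1 - E).sum()                       # lower entropy -> higher weight

scores = (X / X.max(axis=0)) @ w                  # simple weighted composite score
print("criterion weights:", np.round(w, 3))
print("concept ranking (best first):", np.argsort(-scores))
```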