22 results for One-shot information theory
in CentAUR: Central Archive University of Reading - UK
Abstract:
This paper applies the O3BPSK (orthogonal on-off PSK) signaling scheme to multipath fading CDMA channels for the purpose of near-far resistant detection in the reverse link. Based on the maximum multipath spreading delay, a minimum duration of the “off” period is suggested, with which the temporally adjacent bits (TABs) from different users at the receiver are decoupled. As a result, a Rake-type one-shot linear decorrelating detector (LDD) is obtained. Since no knowledge of echo amplitudes is needed, blind detection can be realised.
Abstract:
This paper proposes a new signaling scheme, orthogonal on-off BPSK (O3BPSK), for near-far resistant detection in asynchronous DS/CDMA systems (uplink). The temporally adjacent bits from different users in the received signal are decoupled by the on-off signaling, and the original data rate is maintained, with no increase in transmission rate, by adopting an orthogonal structure. The detector at the receiver is a one-shot linear decorrelating detector, which depends upon neither hard decision nor specific channel coding. Computer simulations are shown to confirm the theoretical analysis.
Abstract:
In this paper we consider a cooperative communication system where some a priori information about the wireless channels is available at the transmitter. Several opportunistic relaying strategies are developed to fully utilize the available channel information. An explicit expression for the outage probability is then derived for each proposed cooperative scheme, as well as the diversity-multiplexing tradeoff, by using order statistics. Our analytical results show that the more channel information is available at the transmitter, the better performance a cooperative system can achieve. When the exact values of the source-relay channels are available, the performance loss at low SNR can be effectively suppressed. When the source node has access to the source-relay and relay-destination channels, full diversity can be achieved at the cost of only one extra channel use for relaying transmission, and an optimal diversity-multiplexing tradeoff of d(r) = (N + 1)(1 - 2r) can be achieved, where N is the number of all possible relaying nodes.
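The stated tradeoff is easy to check numerically. A minimal sketch (the function name and the sample values of N and r are illustrative, not from the paper):

```python
def dmt(N: int, r: float) -> float:
    """Diversity order d(r) = (N + 1)(1 - 2r) for N candidate relays,
    meaningful for multiplexing gains 0 <= r <= 1/2."""
    return (N + 1) * (1 - 2 * r)

# With N = 3 relays, full diversity N + 1 = 4 is reached at r = 0,
# and the diversity gain vanishes at r = 1/2.
print(dmt(3, 0.0))   # 4.0
print(dmt(3, 0.5))   # 0.0
```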
Abstract:
This text contains papers presented at the Institute of Mathematics and its Applications Conference on Control Theory, held at the University of Strathclyde in Glasgow. The contributions cover a wide range of topics of current interest to theoreticians and practitioners, including algebraic systems theory, nonlinear control systems, adaptive control, robustness issues, infinite-dimensional systems, application studies, and connections to mathematical aspects of information theory and data fusion.
Abstract:
Purpose – This paper seeks to examine the nature of “service innovation” in the facilities management (FM) context. It reviews recent thinking on “service innovation” as distinct from “product innovation”. Applying these contemporary perspectives, it describes UK case studies of 11 innovations in different FM organisations, including both in-house client-based innovations and third-party innovations.
Design/methodology/approach – The study described in the paper encompasses 11 different innovations that constitute a mix of process, product and practice innovations. All of the innovations stem from UK-based organisations that were subject to in-depth interviews regarding the identification, screening, commitment of resources and implementation of the selected innovations.
Findings – The research suggested that service innovation is highly active in the UK FM sector. However, the process of innovation rarely followed a common formalised path. Generally, the innovations were one-shot commitments at the early stage. None of the innovations studied failed to proceed to the full adoption stage. This was due either to the reluctance of participating organisations to volunteer “tested but unsuccessful” innovations or to the absence of any trial methods that might have exposed an innovation's shortcomings.
Research limitations/implications – The selection of innovations was restricted to the UK context. Moreover, the choice of innovations was partly determined by the innovating organisation. This selection process appeared to emphasise “one-shot”, high-profile technological innovations, typically associated with software, and may have been at the expense of less resource-intensive, bottom-up innovations.
Practical implications – This paper suggests that there is a role for “research and innovation” teams within larger FM organisations, whether client-based or third-party. Central to this philosophy is an approach that is open to the possibility of failure. The innovations studied were risk averse, with a firm commitment to proceed made at the early stage.
Originality/value – This paper introduces new thinking on the subject of “service innovation” to the context of FM. It presents research and development as a planned approach to innovation, which will enable service organisations to fully test and exploit service innovations.
Abstract:
It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. ‘Emergent’ properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.
Abstract:
This paper proposes a convenient signaling scheme, orthogonal on-off BPSK (O3BPSK), for near-far (NF) resistant detection in asynchronous direct-sequence code-division multiple-access (DS/CDMA) systems (uplink). The temporally adjacent bits from different users in the received signal are decoupled by the on-off signaling, and the original data rate is maintained, with no increase in transmission rate, by adopting an orthogonal structure. The detector at the receiver is a one-shot linear decorrelating detector, which depends upon neither hard decision nor specific channel coding. The application of the O3 strategy to differentially encoded BPSK (D-BPSK) sequences is also presented. Finally, computer simulations are shown to confirm the theoretical analysis.
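A linear decorrelating detector of the kind named in these abstracts applies the inverse of the signatures' cross-correlation matrix to the matched-filter outputs, removing multiple-access interference. A minimal stdlib sketch for two synchronous users (the signature sequences are made-up toy values, not the O3BPSK waveforms from the paper, and the noiseless case is shown):

```python
# Toy signature sequences for two users (entries are chips).
s1 = [0.5, 0.5, 0.5, 0.5]
s2 = [0.5, 0.5, -0.5, 0.5]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

b = [1.0, -1.0]                          # transmitted BPSK bits
R = [[dot(s1, s1), dot(s1, s2)],
     [dot(s2, s1), dot(s2, s2)]]         # cross-correlation matrix
y = [R[0][0] * b[0] + R[0][1] * b[1],
     R[1][0] * b[0] + R[1][1] * b[1]]    # matched-filter outputs, noiseless

# Decorrelating detection: apply R^{-1} (2x2 inverse), then slice.
det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
x0 = ( R[1][1] * y[0] - R[0][1] * y[1]) / det
x1 = (-R[1][0] * y[0] + R[0][0] * y[1]) / det
b_hat = [1.0 if x > 0 else -1.0 for x in (x0, x1)]
print(b_hat)                             # [1.0, -1.0]
```

Despite the signature cross-correlation of 0.5, the decorrelator recovers both bits exactly; its cost is noise enhancement, which the on-off structure in these papers is designed to keep within a one-shot (single-bit) window.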
Abstract:
This paper addresses the effects of synchronisation errors (time delay, carrier phase, and carrier frequency) on the performance of linear decorrelating detectors (LDDs). A major effect is that all LDDs require a certain degree of power control in the presence of synchronisation errors. The multi-shot sliding window algorithm (SLWA) and the hard decision method (HDM) are analysed and their power control requirements examined. A more efficient one-shot detection scheme, called “hard-decision based coupling cancellation”, is also proposed and analysed. These schemes are then compared with the isolation bit insertion (IBI) approach in terms of power control requirements.
Abstract:
It has been years since the introduction of the Dynamic Network Optimization (DNO) concept, yet DNO development is still in its infancy, largely owing to the lack of a breakthrough in reducing the lengthy optimization runtime. Our previous work, a distributed parallel solution, achieved a significant speed gain. To cater for the increased optimization complexity driven by the uptake of smartphones and tablets, however, this paper examines the potential areas for further improvement and presents a novel asynchronous distributed parallel design that minimizes inter-process communication. The new approach is implemented and applied to real-life projects, whose results demonstrate a speedup of 7.5 times on a 16-core distributed system, compared to 6.1 times for our previous solution, with no degradation in the optimization outcome. This is a solid step towards the realization of DNO.
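The reported figures translate into parallel efficiency (speedup divided by core count). A quick check, using only the numbers stated in the abstract:

```python
cores = 16
speedup_new, speedup_old = 7.5, 6.1   # figures reported in the abstract

# Parallel efficiency: fraction of ideal linear scaling achieved.
efficiency_new = speedup_new / cores  # ~0.469, i.e. ~47% of ideal
efficiency_old = speedup_old / cores  # ~0.381, i.e. ~38% of ideal
improvement = speedup_new / speedup_old  # ~1.23x over the prior design

print(round(efficiency_new, 3), round(efficiency_old, 3), round(improvement, 2))
```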
Abstract:
This study suggests a statistical strategy for explaining how food purchasing intentions are influenced by different levels of risk perception and trust in food safety information. The modelling process is based on Ajzen's Theory of Planned Behaviour and includes trust and risk perception as additional explanatory factors. Interaction and endogeneity across these determinants are explored through a system of simultaneous equations, while the SPARTA equation is estimated through an ordered probit model. Furthermore, parameters are allowed to vary as a function of socio-demographic variables. The application explores chicken purchasing intentions both in a standard situation and conditional on a hypothetical salmonella scare. Data were collected through a nationally representative UK-wide survey of 533 respondents in face-to-face, in-home interviews. Empirical findings show that interactions exist among the determinants of planned behaviour and that socio-demographic variables improve the model's performance. Attitudes emerge as the key determinant of intention to purchase chicken, while trust in food safety information provided by the media reduces the likelihood of purchasing. (C) 2006 Elsevier Ltd. All rights reserved.
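An ordered probit of the kind used above maps a linear predictor into ordered-category probabilities through the standard normal CDF. A minimal stdlib sketch (the cutpoints and predictor value are hypothetical, not estimates from this study):

```python
import math

def norm_cdf(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordered_probit_probs(xb: float, cutpoints: list) -> list:
    """P(y = j) = Phi(c_j - x'beta) - Phi(c_{j-1} - x'beta),
    with the outer cutpoints -inf and +inf implied."""
    cdfs = [0.0] + [norm_cdf(c - xb) for c in cutpoints] + [1.0]
    return [hi - lo for lo, hi in zip(cdfs, cdfs[1:])]

# Hypothetical linear predictor and cutpoints for a 3-category
# purchase-intention scale (low / medium / high).
probs = ordered_probit_probs(xb=0.4, cutpoints=[-0.5, 1.0])
print([round(p, 3) for p in probs])  # three probabilities summing to 1
```

Letting the parameters behind `xb` vary with socio-demographic variables, as the abstract describes, amounts to interacting those variables with the regressors before forming the linear predictor.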
Abstract:
Another Proof of the Preceding Theory was produced as part of a residency run by Artists in Archeology in conjunction with the Stonehenge Riverside project. The film explores the relationship between science, work and ritual, imagining archaeology as a future cult. As two robed disciples stray from the dig, they are drawn to the drone of the stones and proceed to play the henge like a gigantic theremin. Just as a theremin is played with the hand interfering in an electric circuit, producing sound without contact, so the stones respond to the choreographed bodily proximity. Finally, one of the two continues alone to the avenue at Avebury, where the magnetic pull of the stones reaches its climax. Shot on VHS, the film features a score by Zuzushi Monkey, with percussion and theremin sounds mirroring the action. The performers are mostly artists and archaeologists from the art and archaeology teams. The archaeologists were encouraged to perform their normal work in the robes, in an attempt to explore the meeting points of science and ritual and to interrogate our relationship to an ultimately unknowable prehistoric past, in which activities we do not understand are relegated to the realm of religion. Stonehenge has unique acoustic properties: its large sarsen stones are finely worked on the inside and left rough on the outside, intensifying sound waves within the inner horseshoe. But since the monument's real use, having been built over centuries, remains ambiguous, the film proposes that our attempts to decode the stones may themselves become encoded in their cumulative meaning for future researchers.