51 results for scalable coding
Abstract:
We propose a 2R regeneration scheme based on a nonlinear optical loop mirror and optical filtering. The feasibility of wavelength-division multiplexing operation at 40 Gbit/s is numerically demonstrated. We examine the characteristics of one-step regeneration and discuss networking applications.
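As a rough illustration of why a nonlinear optical loop mirror (NOLM) can act as a 2R regenerator, the sketch below evaluates the standard NOLM power transfer function for an asymmetric coupler; the coupling ratio, nonlinear coefficient and loop length are assumed values for illustration, not the parameters used in the paper.

```python
# Minimal sketch (assumed parameters): power transfer of a NOLM with an
# asymmetric coupler, T(P) = 1 - 2*a*(1-a)*(1 + cos((1-2a)*gamma*L*P)).
# The suppressed transmission at low power and the near-flat peak around
# the switching power are what provide 2R-type reshaping of 'zeros' and 'ones'.
import numpy as np

alpha = 0.45          # coupler power-splitting ratio (assumed)
gamma = 2.0e-3        # nonlinear coefficient, 1/(W*m) (assumed)
loop_length = 5.0e3   # loop length in metres (assumed)

def nolm_transmission(power_w):
    """Fraction of input peak power transmitted by the loop mirror."""
    phase_diff = (1.0 - 2.0 * alpha) * gamma * loop_length * power_w
    return 1.0 - 2.0 * alpha * (1.0 - alpha) * (1.0 + np.cos(phase_diff))

powers = np.linspace(0.0, 4.0, 9)          # input peak powers in W
for p, t in zip(powers, nolm_transmission(powers)):
    print(f"P_in = {p:4.2f} W  ->  T = {t:5.3f}")
```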
Abstract:
To guarantee QoS for multicast transmission, admission control for multicast sessions is required. The probe-based multicast admission control (PBMAC) scheme is a scalable and simple approach. However, PBMAC suffers from the subsequent request problem, which can significantly reduce the maximum number of multicast sessions that a network can admit. In this letter, we describe the subsequent request problem and propose an enhanced PBMAC scheme to solve it. The enhanced scheme makes use of complementary probing and remarking, which require only minor modifications to the original scheme. Using a fluid-based analytical model, we prove that the enhanced scheme can always admit a higher number of multicast sessions. Furthermore, we validate the analytical model using packet-based simulation. Copyright © 2005 The Institute of Electronics, Information and Communication Engineers.
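The basic probe-based admission decision that PBMAC builds on can be sketched in a few lines. This is a generic illustration, not the enhanced scheme with complementary probing and remarking proposed in the letter: probe packets are sent ahead of the session, and the session is admitted only if the measured fraction of marked or lost probes stays below a threshold.

```python
# Minimal sketch of a generic probe-based admission decision (not the
# enhanced PBMAC scheme of the letter): the receiver counts how many probe
# packets arrived marked or were lost and admits the session only if the
# measured congestion level stays below a preset threshold.
import random

def probe_outcomes(n_probes, congestion_prob):
    """Simulate probe packets; True means the probe was marked or lost."""
    return [random.random() < congestion_prob for _ in range(n_probes)]

def admit_session(outcomes, threshold=0.05):
    """Admit only if the marked/lost fraction is below the threshold."""
    marked_fraction = sum(outcomes) / len(outcomes)
    return marked_fraction < threshold

random.seed(1)
light_load = probe_outcomes(n_probes=500, congestion_prob=0.01)
heavy_load = probe_outcomes(n_probes=500, congestion_prob=0.20)
print("admit under light load:", admit_session(light_load))   # expected True
print("admit under heavy load:", admit_session(heavy_load))   # expected False
```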
Abstract:
We present an information-theoretic analysis of the tradeoff between bit-error-rate improvement and data-rate loss when skewed channel coding is used to suppress pattern-dependent errors in digital communications. Without loss of generality, we apply the developed general theory to the particular example of a high-speed fiber communication system with a strong patterning effect. © 2007 IEEE.
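The rate side of this tradeoff is a standard information-theoretic fact and can be illustrated with a short sketch: a binary code constrained to a fraction p of 'ones' carries at most the binary entropy H2(p) bits per transmitted bit, so skewing the bit statistics costs 1 - H2(p) in data rate. The numbers below are generic; the BER-improvement side of the tradeoff is system specific and is not modelled here.

```python
# Minimal sketch of the rate side of the tradeoff: a 'skewed' binary code
# that constrains the fraction of ones to p carries at most H2(p) bits per
# transmitted bit, so the relative data-rate loss is 1 - H2(p).
# (Generic information-theory bound; the BER gain of suppressing
# pattern-dependent errors is not modelled here.)
import math

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

for p in (0.5, 0.4, 0.3, 0.2, 0.1):
    rate = binary_entropy(p)                 # achievable bits per channel bit
    print(f"fraction of ones p = {p:3.1f}:  max rate = {rate:5.3f}, "
          f"rate loss = {1.0 - rate:5.3f}")
```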
Abstract:
Distributed source coding (DSC) has recently been considered as an efficient approach to data compression in wireless sensor networks (WSN). Using this coding method, multiple sensor nodes compress their correlated observations without inter-node communication. Therefore, energy and bandwidth can be efficiently saved. In this paper, we investigate a random-binning-based DSC scheme for remote source estimation in WSN and its performance in terms of estimated signal-to-distortion ratio (SDR). With the introduction of a detailed power consumption model for wireless sensor communications, we quantitatively analyze the overall network energy consumption of the DSC scheme. We further propose a novel energy-aware transmission protocol for the DSC scheme, which flexibly optimizes the DSC performance in terms of either SDR or energy consumption by adapting the source coding and transmission parameters to the network conditions. Simulations validate the energy efficiency of the proposed adaptive transmission protocol. © 2007 IEEE.
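The binning idea behind such DSC schemes can be illustrated with a toy modulo-binning sketch (bin size and correlation model are assumptions, not the paper's random-binning construction): the sensor transmits only its reading modulo M, and the sink resolves the ambiguity using correlated side information from a neighbouring node.

```python
# Minimal sketch of binning-based distributed source coding for two
# correlated integer readings: x (at the sensor) and y (side information
# at the sink). The sensor transmits only x mod M, and the sink picks the
# value with that remainder closest to y. Decoding succeeds whenever
# |x - y| < M/2. Bin size and correlation model are illustrative.
import random

def encode(x, m):
    return x % m                     # bin index: the only bits transmitted

def decode(bin_index, y, m):
    base = y - (y - bin_index) % m   # largest candidate <= y with that remainder
    candidates = (base, base + m)
    return min(candidates, key=lambda c: abs(c - y))

random.seed(0)
m = 16                               # bin size: log2(16) = 4 bits per reading
errors = 0
for _ in range(10000):
    x = random.randint(0, 1023)      # 10-bit raw reading
    y = x + random.randint(-5, 5)    # correlated side information, |x - y| <= 5 < m/2
    if decode(encode(x, m), y, m) != x:
        errors += 1
print("decoding errors:", errors)    # expected 0 since |x - y| < M/2
```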
Abstract:
We have investigated how optimal coding for neural systems changes with the time available for decoding. Optimization was in terms of maximizing information transmission. We have estimated the parameters for Poisson neurons that optimize Shannon transinformation under the assumption of rate coding. We observed a hierarchy of phase transitions from binary coding, for small decoding times, toward discrete (M-ary) coding with two, three and more quantization levels for larger decoding times. We postulate that the presence of subpopulations with specific neural characteristics could be a signature of an optimal population coding scheme, and we use the mammalian auditory system as an example.
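The kind of optimisation described can be sketched numerically (rate levels, priors and decoding windows below are assumptions): compute the Shannon mutual information between a small set of firing-rate levels and the spike count of a Poisson neuron observed for a decoding time T, and compare a two-level with a three-level code as T grows.

```python
# Minimal sketch (assumed rates and priors): mutual information between a
# discrete set of firing-rate levels and the spike count of a Poisson
# neuron observed for a decoding time T. With the assumed levels, the
# two-level code carries more information at very short windows, while the
# three-level code overtakes it as T grows.
import math

def poisson_pmf(n, lam):
    if lam == 0.0:
        return 1.0 if n == 0 else 0.0
    return math.exp(-lam + n * math.log(lam) - math.lgamma(n + 1))

def mutual_information(rates, T, n_max=200):
    """I(rate level; spike count) in bits, uniform prior over levels."""
    p_level = 1.0 / len(rates)
    info = 0.0
    for n in range(n_max + 1):
        p_n = sum(p_level * poisson_pmf(n, r * T) for r in rates)
        if p_n == 0.0:
            continue
        for r in rates:
            p_n_given_r = poisson_pmf(n, r * T)
            if p_n_given_r > 0.0:
                info += p_level * p_n_given_r * math.log2(p_n_given_r / p_n)
    return info

binary = (5.0, 50.0)          # spikes/s, assumed 'off'/'on' levels
ternary = (5.0, 25.0, 50.0)   # assumed three-level code
for T in (0.01, 0.05, 0.2, 1.0):   # decoding windows in seconds
    print(f"T = {T:4.2f} s:  I_binary = {mutual_information(binary, T):5.3f} bits, "
          f"I_ternary = {mutual_information(ternary, T):5.3f} bits")
```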
Abstract:
We demonstrate a novel subcarrier coding scheme combined with pre-EDC for fibre nonlinearity mitigation in CO-OFDM, showing that a performance improvement of 1.5 dB can be achieved in a 150 Gb/s BPSK PDM CO-OFDM transmission.
Abstract:
We propose a new approach to the generation of an alphabet for secret key exchange relying on small variations in the cavity length of an ultra-long fiber laser. This new concept is supported by experimental results showing how the radio-frequency spectrum of the laser can be exploited as a carrier to exchange information. The test bench for our proof of principle is a 50 km-long fiber laser linking two users, Alice and Bob, where each user can randomly add an extra 1 km-long segment of fiber. The laser length is therefore driven by two independent random binary values, which makes the length itself a random variable. The security of the key exchange is ensured whenever the two independent random choices lead to the same laser length and, hence, to the same free spectral range.
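The length-based key logic described above can be sketched as a small simulation (the linear-cavity FSR formula and the parameter values are illustrative assumptions): each user randomly adds, or does not add, a 1 km segment; only the rounds in which exactly one segment is added, which are indistinguishable to an eavesdropper monitoring the free spectral range, are kept as key bits.

```python
# Minimal sketch of the length-based key exchange described in the abstract.
# Each user randomly adds (bit = 1) or does not add (bit = 0) a 1 km fibre
# segment to a 50 km cavity. An eavesdropper monitoring the free spectral
# range (FSR) learns only the total length. The mixed choices (0,1) and
# (1,0) give the same length, so those rounds are kept: each user infers
# the other's bit from the total, while the eavesdropper cannot tell who
# added the segment. FSR = c / (2 n L) assumes a linear cavity for illustration.
import random

C = 3.0e8            # speed of light, m/s
N_EFF = 1.45         # effective refractive index (assumed)
BASE_LENGTH = 50e3   # base cavity length, m
SEGMENT = 1e3        # optional extra segment, m

def free_spectral_range(alice_bit, bob_bit):
    length = BASE_LENGTH + SEGMENT * (alice_bit + bob_bit)
    return C / (2.0 * N_EFF * length)

random.seed(7)
key_alice, key_bob = [], []
for _ in range(20):
    a, b = random.randint(0, 1), random.randint(0, 1)
    fsr = free_spectral_range(a, b)                       # what everyone can measure
    segments_added = round(C / (2.0 * N_EFF * fsr) - BASE_LENGTH) // int(SEGMENT)
    if segments_added == 1:                               # ambiguous to an eavesdropper
        key_alice.append(segments_added - a)              # Alice infers Bob's bit
        key_bob.append(b)                                 # convention: key bit = Bob's bit
print("Alice's key:", key_alice)
print("Bob's key:  ", key_bob)
print("keys match: ", key_alice == key_bob)
```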
Abstract:
The use of hMSCs for allogeneic therapies requiring lot sizes of billions of cells will necessitate large-scale culture techniques such as the expansion of cells on microcarriers in bioreactors. Whilst much research investigating hMSC culture on microcarriers has focused on growth, much less addresses their harvesting for passaging or as a step towards cryopreservation and storage. A successful new harvesting method has recently been outlined for cells grown on SoloHill microcarriers in a 5 L bioreactor [1]. Here, this new method is set out in detail, harvesting being defined as a two-step process involving cell 'detachment' from the microcarriers' surface followed by the 'separation' of the two entities. The new detachment method is based on theoretical concepts originally developed for secondary nucleation due to agitation. Based on this theory, it is suggested that a short period (here 7 min) of intense agitation in the presence of a suitable enzyme should detach the cells from the relatively large microcarriers. In addition, once detached, the cells should not be damaged because they are smaller than the Kolmogorov microscale. Detachment was then successfully achieved for hMSCs from two different donors using microcarrier/cell suspensions of up to 100 mL in a spinner flask. In both cases, harvesting was completed by separating cells from microcarriers using a Steriflip® vacuum filter. The overall harvesting efficiency was >95%, and after harvesting the cells maintained all the attributes expected of hMSCs. The underlying theoretical concepts suggest that the method is scalable, and this aspect is discussed as well. © 2014 The Authors.
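The Kolmogorov-microscale argument can be illustrated with a back-of-envelope calculation (all numbers below are assumptions, not values from the study): with eta = (nu^3 / eps)^(1/4), a plausible dissipation rate during the brief intense-agitation step gives a microscale of several tens of micrometres, larger than a typical suspended hMSC.

```python
# Back-of-envelope sketch of the Kolmogorov-microscale argument: cells are
# expected to survive intense agitation if they are smaller than the
# Kolmogorov length scale eta = (nu^3 / eps)^(1/4). The dissipation rate
# and cell diameter below are illustrative assumptions, not values
# reported in the paper.
nu = 1.0e-6            # kinematic viscosity of culture medium, m^2/s (assumed ~water)
eps = 0.1              # mean specific energy dissipation rate, W/kg (assumed)
cell_diameter = 20e-6  # typical suspended hMSC diameter, m (assumed)

eta = (nu ** 3 / eps) ** 0.25
print(f"Kolmogorov microscale: {eta * 1e6:.0f} micrometres")
print(f"Cell diameter:         {cell_diameter * 1e6:.0f} micrometres")
print("cell smaller than eta:", cell_diameter < eta)
```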
Abstract:
We present experimental results on a 50km fiber laser switching among four different values of the free-spectral range for possible applications in secure key-distribution. © 2014 OSA.
Abstract:
In this paper, we demonstrate through computer simulation and experiment a novel subcarrier coding scheme combined with pre-electrical dispersion compensation (pre-EDC) for fiber nonlinearity mitigation in coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems. As the frequency spacing in CO-OFDM systems is usually small (tens of MHz), neighbouring subcarriers tend to experience correlated nonlinear distortions after propagation over a fiber link. As a consequence, nonlinearity mitigation can be achieved by encoding and processing neighbouring OFDM subcarriers simultaneously. Herein, we propose to adopt the concept of dual phase-conjugated twin waves for CO-OFDM transmission. Simulation and experimental results show that this simple technique, combined with 50% pre-EDC, can effectively offer up to 1.5 and 0.8 dB performance gains in CO-OFDM systems with BPSK and QPSK modulation formats, respectively.
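A toy numerical illustration of the twin-wave idea is sketched below: a symbol is sent on one subcarrier and its complex conjugate on the neighbouring one, and because the correlated first-order nonlinear distortions largely cancel when the pair is recombined, the distortion is suppressed. The distortion model and levels are assumptions, and the sketch ignores the spectral-efficiency aspect that the dual-PCTW construction addresses.

```python
# Toy sketch of phase-conjugated twin-wave combining across two
# neighbouring OFDM subcarriers. The first-order model assumes the
# distortion on the conjugate twin is the negated conjugate of the
# distortion on the original subcarrier, so averaging r1 with the
# conjugate of r2 cancels it. Noise and distortion levels are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_sym = 10000
bpsk = rng.choice([-1.0 + 0j, 1.0 + 0j], size=n_sym)   # data on subcarrier k

# Correlated first-order nonlinear distortion (assumed model) plus noise
d = 0.4 * (rng.standard_normal(n_sym) + 1j * rng.standard_normal(n_sym))
noise = 0.1 * (rng.standard_normal((2, n_sym)) + 1j * rng.standard_normal((2, n_sym)))

r1 = bpsk + d + noise[0]                    # received subcarrier k
r2 = np.conj(bpsk) - np.conj(d) + noise[1]  # received conjugate twin on k+1

single = r1                                 # no twin-wave processing
combined = 0.5 * (r1 + np.conj(r2))         # distortion cancels to first order

def ber(rx):
    return np.mean((rx.real > 0) != (bpsk.real > 0))

print(f"BER without combining: {ber(single):.4f}")
print(f"BER with combining:    {ber(combined):.4f}")
```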
Abstract:
GitHub is the most popular repository for open source code (Finley 2011). It has more than 3.5 million users, as the company declared in April 2013, and more than 10 million repositories, as of December 2013. It has a publicly accessible API and, since March 2012, it also publishes a stream of all the events occurring on public projects. Interactions among GitHub users are of a complex nature and take place in different forms. Developers create and fork repositories, push code, approve code pushed by others, bookmark their favorite projects and follow other developers to keep track of their activities. In this paper we present a characterization of GitHub, as both a social network and a collaborative platform. To the best of our knowledge, this is the first quantitative study of the interactions happening on GitHub. We analyze the logs from the service over 18 months (between March 11, 2012 and September 11, 2013), describing 183.54 million events, and we obtain information about 2.19 million users and 5.68 million repositories, both growing linearly in time. We show that the distributions of the number of contributors per project, watchers per project and followers per user follow a power-law-like shape. We analyze social ties and repository-mediated collaboration patterns, and we observe a remarkably low level of reciprocity of the social connections. We also measure the activity of each user in terms of authored events, and we observe that very active users do not necessarily have a large number of followers. Finally, we provide a geographic characterization of the centers of activity and we investigate how distance influences collaboration.
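The reciprocity measurement mentioned above can be sketched in a few lines over a toy edge list (toy data, not the GitHub event logs analysed in the paper): reciprocity here is the fraction of directed "follow" edges whose reverse edge is also present.

```python
# Minimal sketch of the reciprocity measurement on a directed 'follows'
# graph: the fraction of edges (a, b) for which the reverse edge (b, a)
# also exists. The edge list below is toy data, not the GitHub logs.
follows = {
    ("alice", "bob"), ("bob", "alice"),      # a reciprocated pair
    ("alice", "carol"), ("dave", "alice"),
    ("erin", "bob"), ("carol", "dave"),
}

reciprocated = sum(1 for a, b in follows if (b, a) in follows)
print(f"reciprocity: {reciprocated / len(follows):.2f}")   # 2 of 6 edges -> 0.33
```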
Abstract:
We propose a new approach for secret key exchange involving the variation of the cavity length of an ultra-long fibre laser. The scheme is based on the realisation that the free spectral range of the laser cavity can be used as an information carrier. We present a proof-of-principle demonstration of this new concept using a 50-km-long fibre laser to link two users, both of whom can randomly add an extra 1-km-long fibre segment.
Abstract:
Computational and communication complexities call for distributed, robust, and adaptive control. This paper proposes a promising way of designing distributed control bottom-up, in which simple controllers are responsible for individual nodes. The overall behavior of the network can be achieved by interconnecting such controlled loops, for example in cascade control, and by enabling the individual nodes to share information about data with their neighbors, without aiming at an unattainable global solution. The problem is addressed by employing fully probabilistic design, which can cope with inherent uncertainties, can be implemented adaptively, and provides a systematic and rich way of sharing information. The paper elaborates the overall solution, applies it to the linear-Gaussian case, and provides simulation results.
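As a toy illustration of the bottom-up interconnection of simple local loops (a plain cascade of proportional controllers on a pair of linear-Gaussian nodes, not the fully probabilistic design developed in the paper), consider the sketch below.

```python
# Minimal sketch of interconnecting simple local loops in a cascade: an
# outer node computes a setpoint for an inner node, and each node only
# runs its own proportional controller on its own scalar, noisy
# (linear-Gaussian) subsystem. This is a toy cascade, not the fully
# probabilistic design of the paper; gains and dynamics are assumed.
import random

random.seed(3)
x_outer, x_inner = 5.0, 0.0      # states of the two nodes
target = 0.0                     # global reference known only to the outer node
k_outer, k_inner = 0.4, 0.6      # local proportional gains (assumed)

for step in range(30):
    # Outer node: regulates its own state and shares a setpoint downstream.
    setpoint_inner = k_outer * (target - x_outer)
    # Inner node: only tracks the setpoint received from its neighbour.
    u_inner = k_inner * (setpoint_inner - x_inner)
    # Linear-Gaussian dynamics with process noise on both nodes.
    x_inner = 0.9 * x_inner + u_inner + random.gauss(0.0, 0.05)
    x_outer = 0.9 * x_outer + 0.5 * x_inner + random.gauss(0.0, 0.05)
    if step % 10 == 9:
        print(f"step {step + 1:2d}: x_outer = {x_outer:6.3f}, x_inner = {x_inner:6.3f}")
```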