54 results for predictive coding
Abstract:
In this paper, we demonstrate through computer simulation and experiment a novel subcarrier coding scheme combined with pre-electrical dispersion compensation (pre-EDC) for fiber nonlinearity mitigation in coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems. As the frequency spacing in CO-OFDM systems is usually small (tens of MHz), neighbouring subcarriers tend to experience correlated nonlinear distortions after propagation over a fiber link. As a consequence, nonlinearity mitigation can be achieved by encoding and processing neighbouring OFDM subcarriers jointly. Herein, we propose to adopt the concept of dual phase-conjugated twin waves for CO-OFDM transmission. Simulation and experimental results show that this simple technique, combined with 50% pre-EDC, can offer performance gains of up to 1.5 dB and 0.8 dB in CO-OFDM systems with BPSK and QPSK modulation formats, respectively.
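A minimal numerical sketch of the twin-wave idea described above, under a toy distortion model: neighbouring subcarriers carry a symbol and its phase conjugate, and the receiver recombines the pair so that phase distortion common to both cancels to first order. The parameters and channel model are illustrative assumptions, not the authors' simulation setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pairs = 64
symbols = rng.choice([1 + 0j, -1 + 0j], size=n_pairs)  # BPSK payload

# Transmitter: subcarrier 2k carries s_k, subcarrier 2k+1 carries conj(s_k)
tx = np.empty(2 * n_pairs, dtype=complex)
tx[0::2] = symbols
tx[1::2] = np.conj(symbols)

# Toy channel: nonlinear phase distortion correlated across each
# neighbouring pair (stand-in for fibre propagation), plus noise
phase = rng.normal(0.0, 0.2, n_pairs)
rx = tx * np.exp(1j * np.repeat(phase, 2))
rx = rx + rng.normal(0.0, 0.1, 2 * n_pairs) + 1j * rng.normal(0.0, 0.1, 2 * n_pairs)

# Receiver: conjugate the twin and average; the phase distortion shared
# by the pair cancels to first order
recovered = 0.5 * (rx[0::2] + np.conj(rx[1::2]))
errors = int(np.sum(np.sign(recovered.real) != np.sign(symbols.real)))
print(f"bit errors after twin-wave recombination: {errors}/{n_pairs}")
```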
Abstract:
The acceleration of solid dosage form product development can be facilitated by including excipients that exhibit poly-/multi-functionality, reducing the time invested in optimising multiple excipients. Because active pharmaceutical ingredients (APIs) and tablet excipients present diverse densification behaviours upon compaction, compacting mixtures of these different powders is a complicated process. The aim of this study was to assess the macrometric characteristics and distribution of surface charges of two powders, indomethacin (IND) and arginine (ARG), and to evaluate their impact on the densification properties of the two powders. Response surface modelling (RSM) was employed to predict the effect of two independent variables, compression pressure (F) and ARG percentage (R), in binary mixtures on the properties of the resultant tablets. The study looked at three responses: porosity (P), tensile strength (S) and disintegration time (T). Micrometric studies showed that IND had a higher charge density (net charge to mass ratio) than ARG; nonetheless, ARG demonstrated good compaction properties with high plasticity (Y = 28.01 MPa). Accordingly, using ARG as a filler in IND tablets improved the tablets' mechanical properties (tablet tensile strength (σ) increased from 0.2 ± 0.05 N/mm² to 2.85 ± 0.36 N/mm² upon adding ARG at a molar ratio of 8:1 to IND). Moreover, tablet disintegration time was shortened to a few seconds in some of the formulations. RSM revealed tablet porosity to be affected by both compression pressure and ARG ratio for IND/ARG physical mixtures (PMs). Conversely, the tensile strength (σ) and disintegration time (T) of the PMs were influenced by the compression pressure, the ARG ratio and their interaction term (FR), and a strong correlation was observed between the experimental results and the predicted data for tablet porosity. This work provides clear evidence of the multi-functionality of ARG as filler, binder and disintegrant for directly compressed tablets.
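A minimal sketch of the response-surface modelling step, assuming hypothetical design points: a full quadratic in compression pressure (F) and ARG percentage (R), including the interaction term FR mentioned above, fitted by least squares. The data values are placeholders, not the study's measurements.

```python
import numpy as np

# hypothetical design points: F (compression pressure, MPa), R (ARG, %)
F = np.array([50, 50, 150, 150, 100, 100, 100, 50, 150], dtype=float)
R = np.array([20, 80, 20, 80, 50, 20, 80, 50, 50], dtype=float)
S = np.array([0.4, 1.9, 0.9, 2.8, 1.6, 0.8, 2.5, 1.2, 1.9])  # tensile strength, N/mm^2

# design matrix for S = b0 + b1*F + b2*R + b12*F*R + b11*F^2 + b22*R^2
X = np.column_stack([np.ones_like(F), F, R, F * R, F**2, R**2])
coef, *_ = np.linalg.lstsq(X, S, rcond=None)

def predict(f, r):
    """Evaluate the fitted quadratic response surface at (f, r)."""
    return coef @ np.array([1.0, f, r, f * r, f**2, r**2])

print("fitted coefficients:", np.round(coef, 5))
print("predicted sigma at F=120 MPa, R=60%:", round(predict(120, 60), 2))
```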
Abstract:
GitHub is the most popular repository for open source code (Finley 2011). It has more than 3.5 million users, as the company declared in April 2013, and more than 10 million repositories, as of December 2013. It has a publicly accessible API and, since March 2012, it also publishes a stream of all the events occurring on public projects. Interactions among GitHub users are of a complex nature and take place in different forms. Developers create and fork repositories, push code, approve code pushed by others, bookmark their favorite projects and follow other developers to keep track of their activities. In this paper we present a characterization of GitHub, as both a social network and a collaborative platform. To the best of our knowledge, this is the first quantitative study of the interactions happening on GitHub. We analyze the logs from the service over 18 months (between March 11, 2012 and September 11, 2013), describing 183.54 million events, and we obtain information about 2.19 million users and 5.68 million repositories, both growing linearly in time. We show that the distributions of the number of contributors per project, watchers per project and followers per user follow a power-law-like shape. We analyze social ties and repository-mediated collaboration patterns, and we observe a remarkably low level of reciprocity of the social connections. We also measure the activity of each user in terms of authored events, and we observe that very active users do not necessarily have a large number of followers. Finally, we provide a geographic characterization of the centers of activity, and we investigate how distance influences collaboration.
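A minimal sketch, on a toy edge set, of two of the measurements described above: edge reciprocity of the follow graph, and the empirical follower distribution whose tail would be inspected for power-law behaviour in the real dataset. The edges are hypothetical.

```python
from collections import Counter

# hypothetical directed "follows" edges: (follower, followed)
edges = {(1, 2), (2, 1), (1, 3), (3, 4), (4, 3), (5, 1), (5, 2), (5, 3)}

# reciprocity: fraction of edges whose reverse edge also exists
reciprocated = sum(1 for (u, v) in edges if (v, u) in edges)
print("reciprocity:", reciprocated / len(edges))  # low values = mostly one-way follows

# followers per user -> empirical distribution, viewed on a log-log
# scale when checking for a power-law-like tail
followers = Counter(v for (_, v) in edges)
dist = Counter(followers.values())  # k -> number of users with k followers
for k in sorted(dist):
    print(f"{dist[k]} user(s) with {k} follower(s)")
```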
Abstract:
We propose a new approach for secret key exchange involving the variation of the cavity length of an ultra-long fibre laser. The scheme is based on the realisation that the free spectral range of the laser cavity can be used as an information carrier. We present a proof-of-principle demonstration of this new concept using a 50-km-long fibre laser to link two users, both of whom can randomly add an extra 1-km-long fibre segment.
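A back-of-envelope sketch of the scheme, assuming a linear cavity (free spectral range FSR = c / (2nL)) and an illustrative effective refractive index: each user randomly adds a 1 km segment or not, and only the rounds where exactly one user adds a segment are ambiguous to an observer, so those rounds can carry key bits.

```python
c = 299_792_458.0  # speed of light, m/s
n = 1.468          # assumed effective refractive index of the fibre
base = 50_000.0    # 50 km cavity linking the two users

def fsr(extra_a: float, extra_b: float) -> float:
    """Free spectral range of a linear cavity: FSR = c / (2 n L)."""
    L = base + extra_a + extra_b
    return c / (2.0 * n * L)

for a in (0.0, 1000.0):      # user A randomly adds a 1 km segment or not
    for b in (0.0, 1000.0):  # user B does the same
        print(f"A +{a/1000:.0f} km, B +{b/1000:.0f} km -> FSR = {fsr(a, b):.2f} Hz")
# The (0,1) and (1,0) cases yield the same FSR, so an eavesdropper cannot
# tell which user added the segment; those rounds carry the key bits.
```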
Abstract:
A novel simulation model for pyrolysis processes of lignocellulosic biomass in Aspen Plus® was presented at the BC&E 2013. Based on kinetic reaction mechanisms, the simulation calculates product compositions and yields depending on reactor conditions (temperature, residence time, flue gas flow rate) and feedstock composition (biochemical composition, atomic composition, ash and alkali metal content). The simulation model was found to show good correlation with existing publications. In order to further verify the model, dedicated pyrolysis experiments in a 1 kg/h continuously fed fluidised-bed fast pyrolysis reactor are performed. Two types of biomass, one woody and one straw-like feedstock with different characteristics, are processed in order to evaluate the influence of the feedstock composition on the yields of the pyrolysis products and their composition. Furthermore, the temperature response of yields and product compositions is evaluated by varying the reactor temperature between 450 and 550 °C for one of the feedstocks. The yields of the pyrolysis products (gas, oil, char) are determined and their detailed composition is analysed. The experimental runs are reproduced with the corresponding reactor conditions in the Aspen Plus model and the results are compared with the experimental findings.
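A minimal sketch of the kinetic idea underlying such a model, assuming competing first-order Arrhenius reactions converting biomass into gas, oil and char; the rate parameters are illustrative placeholders, not the mechanism or values used in the Aspen Plus model.

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol K)

def arrhenius(A, Ea, T):
    """First-order rate constant k = A * exp(-Ea / (R T)), in 1/s."""
    return A * math.exp(-Ea / (R_GAS * T))

def yields(T_celsius, residence_s):
    T = T_celsius + 273.15
    k = {"gas": arrhenius(1.3e8, 140e3, T),   # assumed A [1/s], Ea [J/mol]
         "oil": arrhenius(2.0e8, 133e3, T),
         "char": arrhenius(1.1e7, 121e3, T)}
    k_tot = sum(k.values())
    # analytic solution of parallel first-order decay over the residence time
    converted = 1.0 - math.exp(-k_tot * residence_s)
    return {prod: converted * ki / k_tot for prod, ki in k.items()}

for T in (450, 500, 550):  # the temperature sweep used in the experiments
    y = yields(T, residence_s=2.0)
    print(f"{T} C:", {p: round(v, 3) for p, v in y.items()})
```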
Abstract:
Purpose – The purpose of this paper is to examine the challenges and potential of big data in heterogeneous business networks and relate these to an implemented logistics solution.
Design/methodology/approach – The paper establishes an overview of challenges and opportunities of current significance in the area of big data, specifically in the context of transparency and processes in heterogeneous enterprise networks. Within this context, the paper presents how existing components and purpose-driven research were combined for a solution implemented in a nationwide network for less-than-truckload consignments.
Findings – Aside from providing an extended overview of today's big data situation, the findings show that technical means and methods available today can constitute a feasible process transparency solution in a large heterogeneous network where legacy practices, reporting lags and incomplete data exist, yet processes are sensitive to inadequate policy changes.
Practical implications – The means introduced in the paper were found to be of utility value in improving process efficiency, transparency and planning in logistics networks. The particular system design choices in the presented solution allow an incremental introduction or evolution of resource handling practices, incorporating existing fragmentary, unstructured or tacit knowledge of experienced personnel into the theoretically founded overall concept.
Originality/value – The paper extends previous high-level views on the potential of big data, and presents new applied research and development results in a logistics application.
Abstract:
This paper examines the benefits and limitations of content distribution using Forward Error Correction (FEC) in conjunction with the Transmission Control Protocol (TCP). FEC can be used to reduce the number of retransmissions that would usually result from lost packets, greatly reducing the requirement for TCP to deal with losses. There is, however, a side-effect to using FEC as a countermeasure to packet loss: an additional bandwidth requirement. For applications such as real-time video conferencing, delay must be kept to a minimum and retransmissions are certainly not desirable, so a balance must be struck between additional bandwidth and delay due to retransmissions. Our results show that, when packet loss occurs, the throughput of data can be significantly improved by combining FEC with TCP, compared to relying solely on TCP for retransmissions. Furthermore, a case study applies the result to demonstrate the achievable improvements in the quality of streaming video perceived by end users.
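A minimal probabilistic sketch, under our own simplification rather than the paper's model, of why FEC reduces TCP retransmissions: with an (n, k) block code, a retransmission is needed only when more than n-k packets of a block are lost, at the cost of (n-k)/n extra bandwidth.

```python
from math import comb

def block_failure(n: int, k: int, p: float) -> float:
    """P(decode failure) = P(more than n-k of the n packets are lost)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(n - k + 1, n + 1))

n, k = 12, 10  # assumed code: 10 data + 2 repair packets (20% overhead)
for p in (0.01, 0.05, 0.10):
    no_retx_tcp = (1 - p) ** k              # TCP alone: all k data packets arrive
    no_retx_fec = 1 - block_failure(n, k, p)
    print(f"loss {p:.0%}: P(no retransmission) "
          f"TCP-only={no_retx_tcp:.3f}, with FEC={no_retx_fec:.3f}")
```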
Abstract:
Erasure control coding has been exploited in communication networks with the aim of improving the end-to-end performance of data delivery across the network. To address concerns over the strengths and constraints of erasure coding schemes in this application, we examine the performance limits of two erasure control coding strategies: forward erasure recovery and adaptive erasure recovery. Our investigation shows that the throughput of a network using an (n, k) forward erasure control code is capped by r = k/n when the packet loss rate p ≤ t_e/n, and by k(1-p)/(n-t_e) when p > t_e/n, where t_e is the erasure control capability of the code. It also shows that the lower bound of the residual loss rate of such a network is (np-t_e)/(n-t_e) for t_e/n < p ≤ 1. In particular, if the code used is maximum distance separable (MDS), the Shannon capacity of the erasure channel, i.e. 1-p, can be achieved, and the residual loss rate is lower bounded by (p+r-1)/r for 1-r < p ≤ 1. To address the requirements of real-time applications, we also investigate the service completion time of the different schemes. It is revealed that the latency of the forward erasure recovery scheme is fractionally higher than that of a scheme without erasure control coding or retransmission mechanisms (using UDP), but much lower than that of the adaptive erasure scheme when the packet loss rate is high. Comparisons between the two erasure control schemes exhibit their advantages as well as disadvantages in the role of delivering end-to-end services. To show the impact of the derived bounds on the end-to-end performance of a TCP/IP network, a case study demonstrates how erasure control coding can be used to maximize the performance of practical systems.
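A direct transcription of the quoted bounds into code, with assumed example code parameters (n, k, t_e); this is a sketch for exploring the bounds, not the authors' implementation.

```python
def throughput_cap(n: int, k: int, t_e: int, p: float) -> float:
    """Throughput upper bound for an (n, k) forward erasure control code."""
    return k / n if p <= t_e / n else k * (1 - p) / (n - t_e)

def residual_loss_lb(n: int, t_e: int, p: float) -> float:
    """Lower bound on residual loss rate, valid for t_e/n < p <= 1."""
    return max(0.0, (n * p - t_e) / (n - t_e))

def mds_residual_lb(r: float, p: float) -> float:
    """MDS-code lower bound (p + r - 1) / r, valid for 1 - r < p <= 1."""
    return max(0.0, (p + r - 1) / r)

n, k, t_e = 255, 223, 32  # assumed example code parameters
for p in (0.05, 0.125, 0.2, 0.4):
    print(f"p={p}: throughput cap {throughput_cap(n, k, t_e, p):.3f}, "
          f"residual loss >= {residual_loss_lb(n, t_e, p):.3f}, "
          f"MDS bound {mds_residual_lb(k / n, p):.3f}")
```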
Abstract:
We quantify the error statistics and patterning effects in a 5 × 40 Gbit/s WDM RZ-DBPSK SMF/DCF fibre link using hybrid Raman/EDFA amplification. We propose an adaptive constrained coding scheme for the suppression of errors due to patterning effects. It is established that this coding technique can greatly reduce the bit error rate (BER) even for large BER values (BER > 10^-1). The proposed approach can be used in combination with forward error correction (FEC) schemes to correct errors even when the real channel BER is outside the FEC workspace.
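A minimal sketch of a constrained code in the spirit described above, assuming for illustration that the error-prone pattern to avoid is a run of three ones; the paper adapts the constraint to the measured patterning effects.

```python
from itertools import product

forbidden = "111"  # illustrative error-prone pattern to exclude
m = 7              # codeword length

# enumerate all length-m binary words that avoid the forbidden pattern
codewords = [cw for cw in ("".join(bits) for bits in product("01", repeat=m))
             if forbidden not in cw]

data_bits = 5      # 81 valid words >= 2**5, giving a rate-5/7 constrained code
assert len(codewords) >= 2 ** data_bits
encode = {format(i, f"0{data_bits}b"): codewords[i] for i in range(2 ** data_bits)}
decode = {cw: dw for dw, cw in encode.items()}

word = "10110"
cw = encode[word]
print(f"data {word} -> codeword {cw} (contains no '{forbidden}')")
assert decode[cw] == word
# Boundary effects between consecutive codewords are ignored in this toy;
# a practical code would also constrain codeword prefixes and suffixes.
```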