272 results for distributed functional observers
Abstract:
The q-Gaussian distribution results from maximizing certain generalizations of Shannon entropy under some constraints. The importance of q-Gaussian distributions stems from the fact that they exhibit power-law behavior and also generalize Gaussian distributions. In this paper, we propose a Smoothed Functional (SF) scheme for gradient estimation using the q-Gaussian distribution, and an optimization algorithm based on this scheme. Convergence results for the algorithm are presented. The performance of the proposed algorithm is demonstrated through simulation results on a queuing model.
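A minimal sketch of the smoothed functional idea, shown here with standard Gaussian perturbations rather than the q-Gaussian perturbations the paper proposes; the objective `f`, the smoothing parameter `delta`, and the sample counts are illustrative assumptions:

```python
import numpy as np

def sf_gradient(f, x, delta=0.1, n_samples=1000, rng=None):
    """One-sided smoothed-functional gradient estimate of f at x.

    Perturbs x with Gaussian noise and averages eta * (f(x + delta*eta) - f(x)) / delta,
    which approximates the gradient of a Gaussian-smoothed version of f.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    fx = f(x)
    for _ in range(n_samples):
        eta = rng.standard_normal(x.shape)
        grad += eta * (f(x + delta * eta) - fx) / delta
    return grad / n_samples

# Illustrative use: gradient descent on a simple quadratic with the SF estimate.
f = lambda x: np.sum((x - 1.0) ** 2)
x = np.zeros(3)
for _ in range(200):
    x -= 0.05 * sf_gradient(f, x)
print(x)  # approaches [1, 1, 1]
```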
Abstract:
Glycosyl hydrolase family 1 beta-glucosidases are important enzymes that serve many diverse functions in plants, including defense, where they hydrolyze defensive compounds such as hydroxynitrile glucosides. A hydroxynitrile glucoside-cleaving beta-glucosidase gene (Llbglu1) was isolated from Leucaena leucocephala, cloned into pET-28a(+) and expressed in E. coli BL21 (DE3) cells. The recombinant enzyme was purified by Ni-NTA affinity chromatography. The optimal temperature and pH for this beta-glucosidase were found to be 45 °C and 4.8, respectively. The purified Llbglu1 enzyme hydrolyzed the synthetic glycosides pNP-glucoside (pNPGlc) and pNP-galactoside (pNPGal). The enzyme also hydrolyzed amygdalin, a hydroxynitrile glycoside, and a few of the tested flavonoid and isoflavonoid glucosides. The kinetic parameters K_m and V_max were found to be 38.59 μM and 0.8237 μM/mg/min for pNPGlc, whereas for pNPGal the values were 1845 μM and 0.1037 μM/mg/min. In the present study, a three-dimensional (3D) model of Llbglu1 was built with the MODELLER software to identify the substrate-binding sites, and the quality of the model was examined using the program PROCHECK. Docking studies indicated that the conserved active-site residues are Glu 199, Glu 413, His 153, Asn 198, Val 270, Asn 340, and Trp 462. Docking of rhodiocyanoside A with the modeled Llbglu1 yielded a binding free energy change (ΔG) of -5.52 kcal/mol, on which basis rhodiocyanoside A can be considered a potential substrate.
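To illustrate how the reported kinetic constants are used, a minimal Michaelis-Menten rate calculation with the pNPGlc values quoted in the abstract; the substrate concentrations chosen are illustrative:

```python
def michaelis_menten(s, vmax, km):
    """Michaelis-Menten initial rate: v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

# Km and Vmax for pNPGlc as reported in the abstract.
KM_PNPGLC = 38.59      # uM
VMAX_PNPGLC = 0.8237   # uM/mg/min

for s in (10.0, 38.59, 200.0):   # illustrative substrate concentrations in uM
    v = michaelis_menten(s, VMAX_PNPGLC, KM_PNPGLC)
    print(f"[S] = {s:7.2f} uM -> v = {v:.4f} uM/mg/min")
# At [S] = Km the rate equals Vmax / 2, as expected.
```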
Abstract:
Clock synchronisation is an important requirement for various applications in wireless sensor networks (WSNs). Most existing clock synchronisation protocols for WSNs use some hierarchical structure, which introduces extra overhead due to the dynamic nature of WSNs. Moreover, it is difficult to integrate these clock synchronisation protocols with sleep scheduling schemes, a major technique for conserving energy. In this paper, we propose a fully distributed, peer-to-peer clock synchronisation protocol, named the Distributed Clock Synchronisation Protocol (DCSP), which uses a novel pullback technique for complete sensor networks. The pullback technique ensures that the synchronisation phases of any pair of clocks always overlap. We derive an exact expression for a bound on the maximum synchronisation error in the DCSP protocol, and a simulation study verifies that the error is indeed less than the computed upper bound. An experimental study using a few TelosB motes also verifies that the pullback occurs as predicted.
Abstract:
We consider the wireless two-way relay channel, in which two-way data transfer takes place between the end nodes with the help of a relay. For the Denoise-And-Forward (DNF) protocol, Koike-Akino et al. showed that adaptively changing the network coding map used at the relay greatly reduces the impact of multiple access interference at the relay. The harmful effect of deep channel fade conditions can be effectively mitigated by a proper choice of these network coding maps at the relay. Alternatively, in this paper we propose a Distributed Space-Time Coding (DSTC) scheme, which effectively removes most of the deep fade channel conditions at the transmitting nodes themselves, without any CSIT and without any need to adaptively change the network coding map used at the relay. It is shown that the deep fades occur when the channel fade coefficient vector falls in one of a finite number of vector subspaces, which are referred to as the singular fade subspaces. A DSTC design criterion, referred to as the singularity minimization criterion, under which the number of such vector subspaces is minimized, is obtained, along with a criterion to maximize the coding gain of the DSTC. Explicit low-decoding-complexity DSTC designs which satisfy the singularity minimization criterion and maximize the coding gain for QAM and PSK signal sets are provided. Simulation results show that, at high signal-to-noise ratio, the DSTC scheme provides large gains compared to the conventional exclusive-OR network code and performs better than the adaptive network coding scheme.
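A generic illustration of distributed space-time coding with a two-relay Alamouti-style codeword and linear combining at the destination; this is not one of the singularity-minimizing designs constructed in the paper, and the QPSK constellation, channel model, and noise level are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

# Two information symbols, mapped to an Alamouti codeword sent by two relays.
x1, x2 = rng.choice(qpsk, 2)
h1, h2 = rng.standard_normal(2) + 1j * rng.standard_normal(2)   # relay-to-destination fades
noise = 0.01 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))

# Slot 1: relays send (x1, x2); slot 2: relays send (-conj(x2), conj(x1)).
y1 = h1 * x1 + h2 * x2 + noise[0]
y2 = -h1 * np.conj(x2) + h2 * np.conj(x1) + noise[1]

# Linear combining decouples the two symbols (classical Alamouti receiver).
g = abs(h1) ** 2 + abs(h2) ** 2
x1_hat = (np.conj(h1) * y1 + h2 * np.conj(y2)) / g
x2_hat = (np.conj(h2) * y1 - h1 * np.conj(y2)) / g

# Nearest-symbol detection.
det1 = qpsk[np.argmin(abs(qpsk - x1_hat))]
det2 = qpsk[np.argmin(abs(qpsk - x2_hat))]
print(det1 == x1, det2 == x2)
```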
Abstract:
Regenerating codes are a class of codes for distributed storage networks that provide reliability and availability of data, and also perform efficient node repair. Another important aspect of a distributed storage network is its security. In this paper, we consider a threat model in which an eavesdropper may gain access to the data stored in a subset of the storage nodes, and possibly also to the data downloaded during repair of some nodes. We provide explicit constructions of regenerating codes that achieve information-theoretic secrecy capacity in this setting.
Abstract:
Opportunistic selection is a practically appealing technique used in multi-node wireless systems to maximize throughput, implement proportional fairness, etc. However, selection is challenging since the information about a node's channel gains is often available only locally at each node and not centrally. We propose a novel multiple-access-based distributed selection scheme that generalizes the best features of the timer scheme, which requires minimal feedback but does not always guarantee successful selection, and the fast splitting scheme, which requires more feedback but guarantees successful selection. Unlike the conventional splitting scheme, the proposed scheme's design explicitly accounts for feedback time overheads, and unlike the timer scheme, it guarantees selection of the user with the highest metric. We analyze and minimize the average time, including feedback, required by the scheme to select. When feedback overheads are accounted for, the proposed scheme is scalable and considerably faster than several schemes proposed in the literature; furthermore, its gains increase as the feedback overhead increases.
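A toy simulation of the timer scheme that the proposed method builds upon: each node maps its local metric to a timer through a common decreasing function, and the node whose timer expires first is selected unless another expiry falls within a vulnerability window; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def timer_selection(metrics, t_max=10.0, vulnerability=0.05):
    """Return the index of the selected node, or None if selection fails.

    Each node sets its timer using only its own metric, via a common
    decreasing mapping, so the best node's timer expires first.
    """
    timers = t_max / (1.0 + metrics)          # decreasing in the metric, local information only
    order = np.argsort(timers)
    first, second = order[0], order[1]
    if timers[second] - timers[first] < vulnerability:
        return None                           # expiries too close: collision, selection fails
    return first

metrics = rng.exponential(size=8)             # e.g. per-node channel gains
winner = timer_selection(metrics)
print(winner, np.argmax(metrics))             # when selection succeeds, winner is the best node
```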
Abstract:
This paper considers sequential hypothesis testing in a decentralized framework. We start with two simple decentralized sequential hypothesis testing algorithms, one of which is later proved to be asymptotically Bayes optimal. We also consider composite versions of decentralized sequential hypothesis testing. A novel nonparametric version of decentralized sequential hypothesis testing, based on universal source coding theory, is developed. Finally, we design a simple decentralized multihypothesis sequential detection algorithm.
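For reference, a compact sketch of Wald's sequential probability ratio test, the basic building block in such decentralized schemes; the Gaussian hypotheses and error targets are illustrative:

```python
import math, random

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: N(mu0, sigma^2) vs H1: N(mu1, sigma^2).

    Accumulates the log-likelihood ratio and stops at the first threshold crossing.
    """
    a = math.log(beta / (1 - alpha))     # lower threshold -> decide H0
    b = math.log((1 - beta) / alpha)     # upper threshold -> decide H1
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= b:
            return ("H1", n)
        if llr <= a:
            return ("H0", n)
    return ("undecided", n)

random.seed(0)
data = (random.gauss(1.0, 1.0) for _ in range(1000))   # samples actually drawn under H1
print(sprt(data))                                       # typically decides H1 after a few samples
```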
Abstract:
In this paper, we consider a distributed function computation setting in which there are m distributed but correlated sources X1,...,Xm and a receiver interested in computing an s-dimensional subspace generated by [X1,...,Xm]Γ for some (m × s) matrix Γ of rank s. We construct a scheme based on nested linear codes and characterize the achievable rates obtained using the scheme. The proposed nested-linear-code approach performs at least as well as the Slepian-Wolf scheme in terms of sum-rate performance for all subspaces and source distributions. In addition, for a large class of distributions and subspaces, the scheme improves upon the Slepian-Wolf approach. The nested-linear-code scheme may be viewed as uniting, under a common framework, both the Korner-Marton approach of using a common linear encoder and the Slepian-Wolf approach of employing different encoders at each source. Along the way, we prove an interesting and fundamental structural result on the nature of subspaces of an m-dimensional vector space V with respect to a normalized measure of entropy. Here, each element in V corresponds to a distinct linear combination of a set {X_i}_{i=1}^m of m random variables whose joint probability distribution is given.
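A small numerical illustration of the setting, with illustrative parameters: the receiver's target is the set of GF(2) linear combinations Γᵀ[X1,...,Xm], and the property a common (Korner-Marton style) linear encoder exploits is that syndromes of the sources add up to the syndrome of their modulo-2 sum:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 8                                   # m sources, illustrative block length n

X = rng.integers(0, 2, size=(m, n))           # one length-n binary block per source
Gamma = np.array([[1, 0],                     # an (m x s) matrix with rank s = 2 over GF(2)
                  [1, 1],
                  [0, 1]])

# The receiver's target: the s linear combinations Gamma^T X over GF(2).
Z = (Gamma.T @ X) % 2
print(Z)

# Korner-Marton flavour: with a common linear encoder H, syndromes add, so the
# syndrome of the modulo-2 sum of the sources equals the sum of the per-source syndromes.
H = rng.integers(0, 2, size=(4, n))           # an illustrative common encoding matrix
per_source = (H @ X.T) % 2                    # what each source would transmit (one column each)
lhs = (H @ (X.sum(axis=0) % 2)) % 2           # syndrome of the mod-2 sum of the sources
rhs = per_source.sum(axis=1) % 2              # sum of the transmitted syndromes
print(np.array_equal(lhs, rhs))               # True: the linearity that nested linear codes exploit
```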
Abstract:
Erasure codes are an efficient means of storing data across a network in comparison to data replication, as they reduce the amount of data stored in the network and offer increased resilience in the presence of node failures. The codes perform poorly, though, when repair of a failed node is called for, as they typically require the entire file to be downloaded to repair a single failed node. A new class of erasure codes, termed regenerating codes, was recently introduced that does much better in this respect. However, given the variety of efficient erasure codes available in the literature, there is considerable interest in the construction of coding schemes that would enable traditional erasure codes to be used, while retaining the feature that only a fraction of the data needs to be downloaded for node repair. In this paper, we present a simple yet powerful framework that does precisely this. Under this framework, the nodes are partitioned into two types and encoded using two codes in a manner that reduces the problem of node repair to that of erasure decoding of the constituent codes. Depending upon the choice of the two codes, the framework can be used to obtain one or more of the following advantages: simultaneous minimization of storage space and repair bandwidth, low complexity of operation, fewer disk reads at helper nodes during repair, and error detection and correction.
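A toy illustration of erasure-coded storage and node repair using a single parity node; it shows the repair-bandwidth issue that motivates the framework (repairing one node requires downloading a fragment from every surviving node) and is not the paper's two-code construction itself; sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
k, fragment = 4, 16                           # k data nodes, illustrative fragment size in bytes

data = rng.integers(0, 256, size=(k, fragment), dtype=np.uint8)   # file split over k data nodes
parity = np.bitwise_xor.reduce(data, axis=0)                      # one extra parity node
nodes = np.vstack([data, parity])                                 # (k + 1) storage nodes in total

# Repair of a single failed node: XOR the fragments of all surviving nodes.
failed = 2
survivors = np.delete(nodes, failed, axis=0)
repaired = np.bitwise_xor.reduce(survivors, axis=0)
print(np.array_equal(repaired, nodes[failed]))   # True, but k fragments had to be downloaded
```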
Abstract:
Recently, Guo and Xia introduced low-complexity decoders called Partial Interference Cancellation (PIC) and PIC with Successive Interference Cancellation (PIC-SIC), which include the Zero Forcing (ZF) and ZF-SIC receivers as special cases, for point-to-point MIMO channels. In this paper, we show that PIC and PIC-SIC decoders are capable of achieving the full cooperative diversity available in wireless relay networks. We give sufficient conditions for a Distributed Space-Time Block Code (DSTBC) to achieve full diversity with PIC and PIC-SIC decoders, and construct a new class of DSTBCs with low-complexity, full-diversity PIC-SIC decoding using complex orthogonal designs. The new class of codes includes, as special cases, a number of known full-diversity PIC/PIC-SIC decodable Space-Time Block Codes (STBCs) constructed for point-to-point channels. The proposed DSTBCs achieve higher rates (in complex symbols per channel use) than the multigroup ML-decodable DSTBCs available in the literature. Simulation results show that the proposed codes have better bit error rate performance than the best known low-complexity, full-diversity DSTBCs.
Abstract:
We consider cooperative spectrum sensing for cognitive radios. We develop an energy-efficient detector with low detection delay using sequential hypothesis testing. A Sequential Probability Ratio Test (SPRT) is used at both the local nodes and the fusion center. We also analyse the performance of this algorithm and compare it with simulations. Uncertainties in the distribution parameters are also considered, and slow fading, with and without perfect channel state information at the cognitive radios, is taken into account.
Abstract:
This paper considers cooperative spectrum sensing in cognitive radios. In our previous work, we developed DualSPRT, a distributed algorithm for cooperative spectrum sensing that uses a Sequential Probability Ratio Test (SPRT) at the cognitive radios as well as at the fusion center. This algorithm works well but is not optimal. In this paper, we propose an improved algorithm, SPRT-CSPRT, which is motivated by Cumulative Sum (CUSUM) procedures. We analyse it theoretically, and we also modify the algorithm to handle uncertainties in SNRs and fading.
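A minimal sketch of the CUSUM recursion that motivates SPRT-CSPRT; the Gaussian pre- and post-change models and the threshold are illustrative:

```python
import random

def cusum(samples, mu0=0.0, mu1=1.0, sigma=1.0, threshold=5.0):
    """CUSUM change detection: g_k = max(0, g_{k-1} + LLR(x_k)); alarm when g_k > threshold."""
    g = 0.0
    for n, x in enumerate(samples, start=1):
        llr = ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        g = max(0.0, g + llr)
        if g > threshold:
            return n          # alarm (detection) time
    return None

random.seed(1)
pre = [random.gauss(0.0, 1.0) for _ in range(50)]    # before the primary user appears
post = [random.gauss(1.0, 1.0) for _ in range(50)]   # after the change
print(cusum(pre + post))                             # alarm shortly after sample 50
```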
Abstract:
Density functional theory (DFT) calculations were performed to investigate the geometric, vibrational, and electronic properties of the chlorogenic acid isomer 3-CQA ((1R,3R,4S,5R)-3-{[(2E)-3-(3,4-dihydroxyphenyl)prop-2-enoyl]oxy}-1,4,5-trihydroxycyclohexanecarboxylic acid), a major phenolic compound in coffee. DFT calculations with the 6-311G(d,p) basis set produce very good results. The electrostatic potential mapped onto an isodensity surface has been obtained. A natural bond orbital (NBO) analysis has been performed in order to study intramolecular bonding, interactions among bonds, and delocalization of unpaired electrons. HOMO-LUMO studies give insights into the interaction of the molecule with other species. The calculated HOMO and LUMO energies indicate that charge transfer occurs within the molecule. (C) 2012 Elsevier B.V. All rights reserved.