958 results for memory access complexity
Abstract:
"Extended Clifford algebras" are introduced as a means to obtain space-time block codes with low ML decoding complexity. Using left regular matrix representations of two specific classes of extended Clifford algebras, two systematic algebraic constructions of full-diversity Distributed Space-Time Codes (DSTCs) are provided for any number of relays that is a power of two. The left regular matrix representation is shown to naturally yield space-time codes meeting the additional constraints required for DSTCs. The DSTCs so constructed have the salient feature of reduced Maximum Likelihood (ML) decoding complexity. In particular, ML decoding of these codes can be performed by applying the lattice decoder algorithm to a lattice of one fourth the dimension required in general. Moreover, these codes distribute power uniformly among the relays and across time, leading to a low Peak-to-Average Power Ratio at the relays.
Abstract:
People with disabilities (PWD) experience difficulties in accessing the transport system (including both infrastructure and services) to meet their needs for health care, employment and other activities. Our research shows that lack of access to the journeys needed for these purposes is a more significant barrier in low and middle income countries than in high income countries, and results in inadequate health care, rehabilitation and access to education and employment. At the same time, the existing transport system in low and middle income countries presents much higher road crash risks than in high income countries. By combining the principles and methods of Road Safety Audit and disability access, and adapting these Western approaches to a low/middle income country context, we have worked with Handicap International Cambodia to develop a Journey Access Tool (JAT) for use by disabled peoples’ organisations (DPOs), people with a disability and other key stakeholders. A key element of the approach is that it involves the participation of PWD on the journeys that they need to take, and it identifies infrastructure and service improvements that should be prioritised in order to facilitate access to these journeys. The JAT has been piloted in Cambodia with a range of PWD. This presentation will outline the design of the JAT and the results of the pilot studies. The information gained thus far strongly suggests that the JAT is a valuable and cost-effective approach that can be used by DPOs and professionals to identify barriers to access and prioritise the steps needed to address them.
Abstract:
We consider the problem of transmission of several discrete sources over a multiple access channel (MAC) with side information at the sources and the decoder. Source-channel separation does not hold for this channel. Sufficient conditions are provided for transmission of sources with a given distortion. The channel could have continuous alphabets (Gaussian MAC is a special case). Various previous results are obtained as special cases.
Abstract:
Review of Memory and Gender in Medieval Europe, 900-1200 by Elizabeth van Houts (Toronto UP, 1999).
Abstract:
Richard Lewontin proposed that the ability of a scientific field to create a narrative for public understanding garners it social relevance. This article applies Lewontin's conceptual framework of the functions of science (manipulatory and explanatory) to compare and explain the current differences in perceived societal relevance of genetics/genomics and proteomics. We provide three examples to illustrate the social relevance and strong cultural narrative of genetics/genomics, for which no counterpart exists for proteomics. We argue that the major difference between genetics/genomics and proteomics is that genomics has a strong explanatory function, due to the strong cultural narrative of heredity. Based on qualitative interviews and observations of proteomics conferences, we suggest that the nature of proteins, the lack of public understanding, and theoretical complexity exacerbate this difference for proteomics. Lewontin's framework suggests that social scientists may find that omics sciences affect social relations in ways that differ from those identified in past analyses of genetics.
Abstract:
In the prediction phase, the hierarchical tree structure obtained from the test image is used to predict every central pixel of an image from its four neighboring pixels. The prediction scheme generates the prediction-error image, to which the wavelet/sub-band coding algorithm can be applied to obtain efficient compression. In the quantization phase, we use a modified SPIHT algorithm to achieve efficiency in memory requirements. The memory constraint plays a vital role in wireless and bandwidth-limited applications. A single reusable list is used instead of the three continuously growing linked lists of SPIHT. This method is error resilient. The performance is measured in terms of PSNR and memory requirements. The algorithm shows good compression performance and significant savings in memory.
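The prediction step described above can be sketched as follows. The abstract does not give the exact predictor (the paper uses a hierarchical tree structure), so a plain four-neighbor average is assumed here purely for illustration:

```python
def predict_errors(img):
    # img: 2-D list of ints (grayscale image). Each interior pixel is
    # predicted from its four neighbors (N, S, E, W); the paper's actual
    # predictor is tree-based, so the plain mean below is an assumption.
    h, w = len(img), len(img[0])
    err = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            pred = (img[y-1][x] + img[y+1][x]
                    + img[y][x-1] + img[y][x+1]) // 4
            # The prediction-error image is what the wavelet/sub-band
            # coder would then compress.
            err[y][x] = img[y][x] - pred
    return err
```

On smooth regions the error image is near zero, which is what makes it cheaper to code than the raw pixels.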
Abstract:
The problem of sensor-network-based distributed intrusion detection in the presence of clutter is considered. It is argued that sensing is best regarded as a local phenomenon in that only sensors in the immediate vicinity of an intruder are triggered. In such a setting, lack of knowledge of intruder location gives rise to correlated sensor readings. A signal-space viewpoint is introduced in which the noise-free sensor readings associated with intruder and clutter appear as surfaces $\mathcal{S_I}$ and $\mathcal{S_C}$, and the problem reduces to one of determining, in distributed fashion, whether the current noisy sensor reading is best classified as intruder or clutter. Two approaches to distributed detection are pursued. In the first, a decision surface separating $\mathcal{S_I}$ and $\mathcal{S_C}$ is identified using Neyman-Pearson criteria. Thereafter, the individual sensor nodes interactively exchange bits to determine whether the sensor readings are on one side or the other of the decision surface. Bounds on the number of bits that need to be exchanged are derived, based on communication complexity (CC) theory. A lower bound derived for the two-party average-case CC of general functions is compared against the performance of a greedy algorithm. The average-case CC of the relevant greater-than (GT) function is characterized within two bits. In the second approach, each sensor node broadcasts a single bit arising from appropriate two-level quantization of its own sensor reading, keeping in mind the fusion rule to be subsequently applied at a local fusion center. The optimality of a threshold test as a quantization rule is proved under simplifying assumptions. Finally, results from a QualNet simulation of the algorithms are presented that include intruder tracking using a naive polynomial-regression algorithm.
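The second approach above (one-bit threshold quantization followed by a fusion rule) can be sketched minimally. The threshold value and the OR fusion rule below are illustrative assumptions; the paper derives the optimal threshold and fusion rule under its own model:

```python
def local_decision(reading, threshold):
    # Two-level (threshold) quantization: each node broadcasts a single
    # bit. The threshold here is an assumed parameter, not the paper's
    # Neyman-Pearson-optimal value.
    return 1 if reading > threshold else 0

def fusion_or(bits):
    # A simple OR fusion rule at the local fusion center: declare
    # "intruder" if any node's bit is set. The paper's actual fusion
    # rule may differ.
    return int(any(bits))
```

For example, readings `[0.5, 3.1, 1.0]` with threshold `2.0` yield bits `[0, 1, 0]` and a fused decision of "intruder".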
Abstract:
The traditional 'publish for free and pay to read' business model adopted by publishers of academic journals can lead to disparity in access to scholarly literature, exacerbated by rising journal costs and shrinking library budgets. However, although the 'pay to publish and read for free' business model of open-access publishing has helped to create a level playing field for readers, it does more harm than good in the developing world.
Abstract:
In this paper we first present the 'wet N2O' furnace oxidation process to grow nitrided tunnel oxides in the thickness range 6 to 8 nm on silicon at a temperature of 800 degrees C. Electrical characteristics of MOS capacitors and MOSFETs fabricated using this oxide as the gate oxide have been evaluated, and the superior features of this oxide are ascertained. The frequency response of the interface states, before and after subjecting the MOSFET gate oxide to constant current stress, is studied using a simple analytical model developed in this work.
Abstract:
The differential encoding/decoding setup introduced by Kiran et al., Oggier et al. and Jing et al. for wireless relay networks that use codebooks consisting of unitary matrices is extended to allow codebooks consisting of scaled unitary matrices. For such codebooks to be used in the Jing-Hassibi protocol for cooperative diversity, the conditions that need to be satisfied by the relay matrices and the codebook are identified. A class of previously known rate-one, full-diversity, four-group encodable and four-group decodable Differential Space-Time Codes (DSTCs) is proposed for use as Distributed DSTCs (DDSTCs) in the proposed setup. To the best of our knowledge, this is the first known low decoding complexity DDSTC scheme for cooperative wireless networks.
Abstract:
Understanding the molecular mechanisms of immunological memory assumes importance in vaccine design. We had earlier hypothesized a mechanism for the maintenance of immunological memory through the operation of a network of idiotypic and anti-idiotypic antibodies (Ab2). Peptides derived from an internal-image-carrying anti-idiotypic antibody are hypothesized to facilitate the perpetuation of antigen-specific T cell memory through peptide-MHC binding similar to that of the antigenic peptide. In the present work, the existence of such peptidomimics of the antigen in the Ab2 variable region and their similarity of MHC-I binding was examined by bioinformatics approaches. The analysis, employing three known viral antigens and one tumor-associated antigen, shows that peptidomimics from Ab2 variable regions have MHC-I binding patterns structurally similar to those of the antigenic peptides, indicating a structural basis for memory perpetuation.
Abstract:
We consider a scenario in which a wireless sensor network is formed by randomly deploying n sensors to measure some spatial function over a field, with the objective of computing a function of the measurements and communicating it to an operator station. We restrict ourselves to the class of type-threshold functions (as defined in the work of Giridhar and Kumar, 2005), of which max, min, and indicator functions are important examples; our discussions are couched in terms of the max function. We view the problem as one of message-passing distributed computation over a geometric random graph. The network is assumed to be synchronous, and the sensors synchronously measure values and then collaborate to compute and deliver the function computed with these values to the operator station. Computation algorithms differ in (1) the communication topology assumed and (2) the messages that the nodes need to exchange in order to carry out the computation. The focus of our paper is to establish (in probability) scaling laws for the time and energy complexity of the distributed function computation over random wireless networks, under the assumption of centralized contention-free scheduling of packet transmissions. First, without any constraint on the computation algorithm, we establish scaling laws for the computation time and energy expenditure for one-time maximum computation. We show that for an optimal algorithm, the computation time and energy expenditure scale, respectively, as $\Theta(\sqrt{n/\log n})$ and $\Theta(n)$ asymptotically as the number of sensors $n \to \infty$. Second, we analyze the performance of three specific computation algorithms that may be used in specific practical situations, namely, the tree algorithm, multihop transmission, and the Ripple algorithm (a type of gossip algorithm), and obtain scaling laws for the computation time and energy expenditure as $n \to \infty$.
In particular, we show that the computation time for these algorithms scales as $\Theta(\sqrt{n/\log n})$, $\Theta(n)$, and $\Theta(\sqrt{n \log n})$, respectively, whereas the energy expended scales as $\Theta(n)$, $\Theta(\sqrt{n/\log n})$, and $\Theta(\sqrt{n \log n})$, respectively. Finally, simulation results are provided to show that our analysis indeed captures the correct scaling. The simulations also yield estimates of the constant multipliers in the scaling laws. Our analyses throughout assume a centralized optimal scheduler, and hence, our results can be viewed as providing bounds for the performance with practical distributed schedulers.
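The tree algorithm mentioned above aggregates the maximum level by level up a tree. A minimal sketch, assuming an illustrative fixed fan-out (the paper's tree construction and scheduling are more involved):

```python
def tree_max_rounds(values, degree=4):
    # One-shot maximum computation over an aggregation tree: in each
    # round, every parent takes the max of up to `degree` children.
    # The fan-out of 4 is an illustrative assumption, not the paper's
    # construction. Returns (max value, number of rounds used).
    level, rounds = list(values), 0
    while len(level) > 1:
        level = [max(level[i:i + degree])
                 for i in range(0, len(level), degree)]
        rounds += 1
    return level[0], rounds
```

With a fixed fan-out the number of rounds grows logarithmically in n; the Θ(√(n/log n)) time in the paper comes from the geometry of the random network (multi-hop distances), which this sketch deliberately ignores.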
Abstract:
The problem of designing high-rate, full-diversity noncoherent space-time block codes (STBCs) with low encoding and decoding complexity is addressed. First, the notion of g-group encodable and g-group decodable linear STBCs is introduced. Then, for a known class of rate-1 linear designs, an explicit construction of fully-diverse signal sets that lead to four-group encodable and four-group decodable differential scaled unitary STBCs is provided for any number of antennas that is a power of two. Previous works on differential STBCs either sacrifice decoding complexity for higher rate or sacrifice rate for lower decoding complexity.
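The complexity benefit of g-group decodability is easy to quantify: when the K jointly encoded symbols split into g groups that can be ML-decoded independently, the search shrinks from |S|^K to g·|S|^(K/g) candidates. A small sketch of this arithmetic (the symbol counts below are illustrative, not taken from the paper):

```python
def ml_search_sizes(constellation_size, num_symbols, groups):
    # Joint ML decoding searches over all |S|^K symbol combinations.
    # A g-group decodable code splits the K symbols into g independent
    # groups of K/g symbols, so only g * |S|^(K/g) candidates need to
    # be evaluated. Assumes groups divides num_symbols evenly.
    joint = constellation_size ** num_symbols
    grouped = groups * constellation_size ** (num_symbols // groups)
    return joint, grouped
```

For instance, with a 4-point constellation, 8 symbols, and 4 groups, the search drops from 65536 candidates to 64.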
Abstract:
A discursive article examining the proposition that provisions for young people in care should be formally extended beyond their 18th birthday when care normally ends. Debate exists as to whether this should include a legislative extension to Care as has happened in some overseas jurisdictions or whether it is more appropriate to simply extend access to dedicated support provisions. The article concludes that both may have a place as long as the complexity of young people's situations is adequately responded to.
Abstract:
It is known that by employing space-time-frequency codes (STFCs) in frequency-selective MIMO-OFDM systems, all three forms of diversity, viz. spatial, temporal and multipath, can be exploited. There exist space-time-frequency block codes (STFBCs) designed using orthogonal designs with a constellation precoder to get full diversity (Z. Liu, Y. Xin and G. Giannakis, IEEE Trans. Signal Processing, Oct. 2002). Since rate-one orthogonal designs exist only for two transmit antennas, rate-one, full-diversity STFBCs cannot be constructed from orthogonal designs for more than two transmit antennas. This paper presents a rate-one STFBC scheme for four transmit antennas designed using quasi-orthogonal designs along with co-ordinate interleaved orthogonal designs (Zafar Ali Khan and B. Sundar Rajan, Proc. ISIT 2002). Conditions on the signal sets that give full diversity are identified. Simulation results are presented to show the superiority of our codes over the existing ones.