205 results for Computational Complexity


Relevance: 20.00%

Abstract:

Complexity theory is an important and growing area of computer science that has caught the imagination of many researchers in mathematics, physics and biology. In order to reach out to a large section of scientists and engineers, the paper introduces elementary concepts in complexity theory in an informal manner, motivating the reader with many examples.

Relevance: 20.00%

Abstract:

The problem of sensor-network-based distributed intrusion detection in the presence of clutter is considered. It is argued that sensing is best regarded as a local phenomenon, in that only sensors in the immediate vicinity of an intruder are triggered. In such a setting, lack of knowledge of the intruder's location gives rise to correlated sensor readings. A signal-space viewpoint is introduced in which the noise-free sensor readings associated with intruder and clutter appear as surfaces f(s) and f(g), and the problem reduces to one of determining, in distributed fashion, whether the current noisy sensor reading is best classified as intruder or clutter. Two approaches to distributed detection are pursued. In the first, a decision surface separating f(s) and f(g) is identified using Neyman-Pearson criteria. Thereafter, the individual sensor nodes interactively exchange bits to determine whether the sensor readings are on one side or the other of the decision surface. Bounds on the number of bits that need to be exchanged are derived, based on communication-complexity (CC) theory. A lower bound derived for the two-party average-case CC of general functions is compared against the performance of a greedy algorithm. Extensions to the multi-party case are straightforward and are briefly discussed. The average-case CC of the relevant greater-than (GT) function is characterized to within two bits. Under the second approach, each sensor node broadcasts a single bit arising from an appropriate two-level quantization of its own sensor reading, keeping in mind the fusion rule to be subsequently applied at a local fusion center. The optimality of a threshold test as a quantization rule is proved under simplifying assumptions. Finally, results from a QualNet simulation of the algorithms are presented, including intruder tracking using a naive polynomial-regression algorithm. (C) 2010 Elsevier B.V. All rights reserved.
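As an illustration of the second approach, here is a minimal sketch of one-bit local quantization followed by a counting fusion rule; the noise model, the threshold tau, the k-out-of-n rule, and all numerical values are hypothetical choices for the example, not the paper's optimized design.

```python
import numpy as np

def local_decisions(readings, tau):
    """Each sensor broadcasts one bit: whether its noisy reading exceeds the threshold tau."""
    return (np.asarray(readings) > tau).astype(int)

def fusion_center(bits, k):
    """Declare 'intruder' if at least k of the one-bit reports are 1 (k-out-of-n rule)."""
    return int(bits.sum() >= k)

# Toy usage: 10 sensors, small measurement noise; an intruder raises the readings
# of only the few sensors in its vicinity (sensing as a local phenomenon).
rng = np.random.default_rng(0)
noise = 0.3 * rng.normal(size=10)
clutter = noise
intruder = noise + np.array([0, 0, 0, 2.5, 3.0, 2.5, 0, 0, 0, 0])
for name, obs in (("clutter", clutter), ("intruder", intruder)):
    bits = local_decisions(obs, tau=1.0)
    print(name, "->", "intruder" if fusion_center(bits, k=2) else "clutter")
```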

Relevance: 20.00%

Abstract:

This paper presents a novel hypothesis on the function of massive feedback pathways in mammalian visual systems. We propose that the cortical feature detectors compete not for the right to represent the output at a point, but for exclusive rights to abstract and represent part of the underlying input. Feedback can do this very naturally. A computational model that implements the above idea for the problem of line detection is presented, and based on it we suggest a functional role for the thalamo-cortical loop during perception of lines. We show that the model successfully tackles the so-called Cross problem. Based on some recent experimental results, we discuss the biological plausibility of our model. We also comment on the relevance of our hypothesis (on the role of feedback) to general sensory information processing and recognition. (C) 1998 Published by Elsevier Science Ltd. All rights reserved.
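The "compete for the input, not the output" principle can be caricatured with a toy iterative loop in which each detector's feedback removes the part of the input it has already claimed, so that feedforward drive is computed on the unclaimed residual. This is only a generic illustration of the principle, not the authors' model; the templates, learning rate, and iteration count are made up.

```python
import numpy as np

def compete_for_input(x, templates, iters=20, lr=0.2):
    """Toy feedback loop: detectors iteratively claim (explain away) parts of the
    input x; feedforward drive is computed on the residual input rather than on
    the raw input, so detectors compete for the input, not for the output."""
    a = np.zeros(len(templates))              # detector activations
    for _ in range(iters):
        residual = x - templates.T @ a        # feedback: subtract what is already claimed
        a += lr * (templates @ residual)      # feedforward drive from the unclaimed residual
        a = np.maximum(a, 0.0)                # keep activations non-negative
    return a

# Toy "Cross"-like input: superposition of two line templates sharing one pixel.
h = np.array([1, 1, 1, 0, 0, 0], float)      # hypothetical horizontal-line template
v = np.array([0, 0, 1, 1, 1, 0], float)      # hypothetical vertical-line template
x = h + v
print(compete_for_input(x, np.vstack([h, v])))  # both detectors settle on claiming their own line
```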

Relevance: 20.00%

Abstract:

This paper looks at the complexity of four different incremental problems: (1) interval partitioning of a flow graph, (2) breadth-first search (BFS) of a directed graph, (3) lexicographic depth-first search (DFS) of a directed graph, and (4) constructing the postorder listing of the nodes of a binary tree. The last problem arises out of the need to incrementally compute the Sethi-Ullman (SU) ordering [1] of the subtrees of a tree after it has undergone changes of a given type. These problems are among those that claimed our attention while we were designing algorithmic techniques for incremental code generation. BFS and DFS certainly have numerous other applications, but as far as our work is concerned, incremental code generation is the common thread linking these problems. The complexity of these problems is studied from two different perspectives. The theory of incremental relative lower bounds (IRLBs) is given in [2]; we use this theory to derive the IRLBs of the first three problems. We then use the notion of a bounded incremental algorithm [4] to prove the unboundedness of the fourth problem with respect to the locally persistent model of computation. The lower-bound result for lexicographic DFS is possibly the most interesting. In [5] the author considers lexicographic DFS to be a problem for which the incremental version may require recomputation of the entire solution from scratch. In that sense, our IRLB result provides further evidence for this possibility, with the proviso that the incremental DFS algorithms considered do not require too much preprocessing.
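For context on problem (4): the Sethi-Ullman ordering evaluates the register-costlier subtree of each node first. Below is a minimal sketch of the classical (non-incremental) SU labeling and the resulting evaluation order for a binary expression tree, in the register-only variant where every leaf needs one register; the class and function names are mine, not from the paper.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def su_label(n: Node) -> int:
    """Sethi-Ullman label: registers needed to evaluate the subtree at n without spills
    (register-only variant: every leaf occupies one register)."""
    if n.left is None and n.right is None:
        return 1
    l, r = su_label(n.left), su_label(n.right)
    return max(l, r) if l != r else l + 1

def su_order(n: Node, out: list) -> list:
    """Postorder listing that visits the register-costlier subtree of each node first."""
    if n.left is not None and n.right is not None:
        first, second = (n.left, n.right) if su_label(n.left) >= su_label(n.right) else (n.right, n.left)
        su_order(first, out)
        su_order(second, out)
    out.append(n)
    return out

# Toy tree for ((a+b)+(c+d)): each half needs 2 registers, the whole tree needs 3.
t = Node(Node(Node(), Node()), Node(Node(), Node()))
print(su_label(t))           # -> 3
print(len(su_order(t, [])))  # -> 7 nodes listed in SU evaluation order
```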

Relevance: 20.00%

Abstract:

Recently, Guo and Xia gave sufficient conditions for an STBC to achieve full diversity when a PIC (Partial Interference Cancellation) or a PIC-SIC (PIC with Successive Interference Cancellation) decoder is used at the receiver. In this paper, we give alternative conditions for an STBC to achieve full diversity with PIC and PIC-SIC decoders, which are equivalent to Guo and Xia's conditions but are much easier to check. Using these conditions, we construct a new class of full-diversity PIC-SIC decodable codes, which contain the Toeplitz codes and a family of codes recently proposed by Zhang, Xu et al. as proper subclasses. With the help of the new criteria, we also show that a class of PIC-SIC decodable codes recently proposed by Zhang, Shi et al. can be decoded with much lower complexity than reported, without compromising full diversity.
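For background, a PIC group decoder projects the received vector onto the orthogonal complement of the interfering groups' channel columns and then decodes each group with a small ML search. The sketch below shows this generic projection step only (not the specific codes discussed in the paper); the channel, grouping, and alphabet are chosen purely for illustration.

```python
import numpy as np
from itertools import product

def pic_group_decode(y, H, groups, alphabet):
    """Generic PIC group decoding: for each group, null out the other groups'
    columns of H by orthogonal projection, then do a small per-group ML search."""
    x_hat = np.zeros(H.shape[1], dtype=complex)
    for g in groups:
        others = [j for j in range(H.shape[1]) if j not in g]
        Ho = H[:, others]
        # Projector onto the orthogonal complement of the interference subspace.
        P = np.eye(H.shape[0]) - Ho @ np.linalg.pinv(Ho)
        yg, Hg = P @ y, P @ H[:, g]
        best = min(product(alphabet, repeat=len(g)),
                   key=lambda s: np.linalg.norm(yg - Hg @ np.array(s)) ** 2)
        x_hat[list(g)] = best
    return x_hat

# Toy usage: 4 symbols from a BPSK-like alphabet, decoded in two groups of two.
rng = np.random.default_rng(1)
H = (rng.normal(size=(6, 4)) + 1j * rng.normal(size=(6, 4))) / np.sqrt(2)
x = np.array([1, -1, -1, 1], dtype=complex)
y = H @ x + 0.05 * (rng.normal(size=6) + 1j * rng.normal(size=6))
print(pic_group_decode(y, H, groups=[(0, 1), (2, 3)], alphabet=[1, -1]))
```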

Relevance: 20.00%

Abstract:

Zero-padded systems with linear receivers are shown to be robust and amenable to fast implementations in single-antenna scenarios. In this paper, properties of such systems are investigated when multiple antennas are present at both ends of the communication link. In particular, their diversity and complexity are evaluated for precoded transmissions. The linear receivers are shown to exploit multipath and receive diversities, even in the absence of any coding at the transmitter. The use of additional redundancy to improve performance is considered, and the effect of transmission rate on diversity order is analyzed. Low-complexity implementations of zero-forcing receivers are devised to further enhance their applicability.
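To fix ideas, here is a minimal sketch of a zero-forcing receiver for a zero-padded block transmission over a single-antenna FIR channel; the block length, channel taps, and padding length are illustrative, and the precoded multi-antenna setting of the paper is not reproduced.

```python
import numpy as np
from scipy.linalg import toeplitz

def zp_channel_matrix(h, K, P):
    """Tall Toeplitz matrix mapping a K-symbol block (followed by P zero-pad
    symbols) to the (K+P)-sample received block, for an FIR channel h."""
    col = np.concatenate([h, np.zeros(K + P - len(h))])
    row = np.concatenate([[h[0]], np.zeros(K - 1)])
    return toeplitz(col, row)

def zf_receive(y, H):
    """Zero-forcing receiver: least-squares inversion of the tall channel matrix."""
    return np.linalg.pinv(H) @ y

# Toy usage: 8-symbol block, 3-tap channel, 2 zero-pad symbols.
rng = np.random.default_rng(2)
h = np.array([1.0, 0.5, 0.2])
K, P = 8, 2
H = zp_channel_matrix(h, K, P)
x = rng.choice([-1.0, 1.0], size=K)
y = H @ x + 0.01 * rng.normal(size=K + P)
print(np.sign(zf_receive(y, H)) == x)   # symbol decisions match the transmitted block
```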

Relevance: 20.00%

Abstract:

We consider a time division duplex multiple-input multiple-output (nt × nr MIMO) system. Using channel state information (CSI) at the transmitter, singular value decomposition (SVD) of the channel matrix is performed. This transforms the MIMO channel into parallel subchannels, but has a low overall diversity order. Hence, we propose X-Codes, which achieve a higher diversity order by pairing the subchannels prior to SVD precoding. In particular, each pair of information symbols is encoded by a fixed 2 × 2 real rotation matrix. X-Codes can be decoded using nr very low complexity two-dimensional real sphere decoders. Error probability analysis for X-Codes enables us to choose the optimal pairing and the optimal rotation angle for each pair. Finally, we show that our new scheme outperforms other low-complexity precoding schemes.
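To illustrate the pairing idea, the sketch below pairs the i-th SVD subchannel with the (n−i+1)-th one, rotates each real symbol pair by a fixed 2 × 2 rotation G(θ), and transmits along the right singular vectors; the channel, symbols, and rotation angle are illustrative assumptions, and the optimal pairing and angles derived in the paper are not reproduced.

```python
import numpy as np

def x_code_precode(H, s, theta):
    """X-Code style precoding sketch: pair the i-th SVD subchannel with the
    (n-i+1)-th, rotate each real symbol pair by a fixed 2x2 rotation G(theta),
    and send the result along the right singular vectors V."""
    U, sigma, Vh = np.linalg.svd(H)
    n = len(sigma)
    G = np.array([[np.cos(theta), np.sin(theta)],
                  [-np.sin(theta), np.cos(theta)]])
    x = np.zeros(n)
    for i in range(n // 2):
        j = n - 1 - i
        x[[i, j]] = G @ s[[i, j]]      # each pair spans one strong and one weak subchannel
    return Vh.conj().T @ x, U, sigma, x

# Toy usage: 4x4 real channel, 4 PAM information symbols.
rng = np.random.default_rng(3)
H = rng.normal(size=(4, 4))
s = np.array([1.0, -1.0, 3.0, -3.0])
x_tx, U, sigma, x = x_code_precode(H, s, theta=np.pi / 8)
z = U.T @ (H @ x_tx)               # receiver applies U^H; noiseless here for clarity
print(np.allclose(z, sigma * x))   # each pair (z_i, z_j) is a small 2x2 real system in (s_i, s_j)
```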

Relevance: 20.00%

Abstract:

An efficient strategy for identification of delamination in composite beams and connected structures is presented. A spectral finite-element model consisting of a damaged spectral element is used for model-based prediction of the damaged structural response in the frequency domain. A genetic algorithm (GA) specially tailored for damage identification is derived and integrated with the finite-element code for automation. For best application of the GA, the sensitivities of various objective functions with respect to the delamination parameters are studied and important conclusions are presented. Model-based simulations of increasing complexity illustrate some of the attractive features of the strategy in terms of accuracy as well as computational cost. This shows the possibility of using such strategies in the development of smart structural health monitoring software and systems.
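A generic GA loop for this kind of model-based identification is sketched below; the forward model, the delamination parameters (location, length), and all GA settings are hypothetical placeholders, not the tailored operators or objective functions studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def predicted_response(params):
    """Placeholder forward model: maps delamination parameters (location, length)
    to a frequency-domain response; stands in for the damaged spectral element."""
    loc, length = params
    freqs = np.linspace(1.0, 10.0, 50)
    return np.sin(freqs * loc) * np.exp(-length * freqs / 10.0)

def objective(params, measured):
    """Mismatch between measured and model-predicted frequency-domain responses."""
    return np.linalg.norm(measured - predicted_response(params))

def ga_identify(measured, bounds, pop=40, gens=60, mut=0.1):
    lo, hi = np.array(bounds)[:, 0], np.array(bounds)[:, 1]
    P = rng.uniform(lo, hi, size=(pop, len(lo)))
    for _ in range(gens):
        fit = np.array([objective(p, measured) for p in P])
        parents = P[np.argsort(fit)[: pop // 2]]                          # truncation selection
        children = (parents[rng.integers(0, len(parents), pop)] +
                    parents[rng.integers(0, len(parents), pop)]) / 2.0    # arithmetic crossover
        children += mut * (hi - lo) * rng.normal(size=children.shape)     # Gaussian mutation
        P = np.clip(children, lo, hi)
    fit = np.array([objective(p, measured) for p in P])
    return P[np.argmin(fit)]

# Toy usage: try to recover a "true" delamination at location 0.6 with length 0.2.
measured = predicted_response((0.6, 0.2))
print(ga_identify(measured, bounds=[(0.0, 1.0), (0.05, 0.5)]))
```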

Relevance: 20.00%

Abstract:

A three-dimensional, transient model is developed for studying heat transfer, fluid flow, and mass transfer for the case of a single-pass laser surface alloying process. The coupled momentum, energy, and species conservation equations are solved using a finite volume procedure. Phase change processes are modeled using a fixed-grid enthalpy-porosity technique, which is capable of predicting the continuously evolving solid-liquid interface. The three-dimensional model is able to predict the species concentration distribution inside the molten pool during alloying, as well as in the entire cross section of the solidified alloy. The model is simulated for different values of various significant processing parameters such as laser power, scanning speed, and powder feed rate in order to assess their influences on the geometry and dynamics of the pool, cooling rates, as well as species concentration distribution inside the substrate. Effects of incorporating property variations in the numerical model are also discussed.
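For orientation, fixed-grid enthalpy-porosity formulations typically suppress velocities in the mushy and solid regions by adding a Carman-Kozeny-type sink to each momentum equation; the standard textbook form is shown below (with liquid fraction f_l, a large mushy-zone constant C, and a small constant b to avoid division by zero), quoted as general background on the technique rather than from this paper.

```latex
% Carman-Kozeny-type momentum sink used in fixed-grid enthalpy-porosity methods:
%   f_l : liquid fraction,  C : mushy-zone constant,  b : small positive constant,
%   u_i : i-th velocity component
S_i = -\, C \, \frac{(1 - f_l)^2}{f_l^3 + b} \, u_i
```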

Relevance: 20.00%

Abstract:

Structural and electronic properties of C-H···O contacts in compounds containing a formyl group are investigated from the perspective of both hydrogen bonding and dipole-dipole interactions, in a systematic and graded approach. The effects of α-substitution and self-association on the nature of the formyl H-atom are studied with the NBO and AIM methodologies. The relative dipole-dipole contributions in formyl C-H···O interactions are obtained for aldehyde dimers. The stabilities and energies of aldehyde clusters (dimer through octamer) have been examined computationally. Such studies have implications for crystallization mechanisms. Experimental X-ray crystal structures of formaldehyde, acrolein and N-methylformamide have been determined in order to ascertain the role of C-H···O interactions in the crystal packing of formyl compounds.

Relevance: 20.00%

Abstract:

Very Long Instruction Word (VLIW) architectures exploit instruction-level parallelism (ILP) with the help of the compiler to achieve higher instruction throughput with minimal hardware. However, control and data dependencies between operations limit the available ILP, which not only hinders the scalability of VLIW architectures but also results in code size expansion. Although speculation and predicated execution mitigate the ILP limitations due to control dependencies to a certain extent, they increase hardware cost and exacerbate code size expansion. Simultaneous multistreaming (SMS) can significantly improve operation throughput by allowing interleaved execution of operations from multiple instruction streams. In this paper we study SMS for VLIW architectures and quantify the benefits associated with it using a case study of the MPEG-2 video decoder. We also propose the notion of virtual resources for VLIW architectures, which decouple architectural resources (resources exposed to the compiler) from microarchitectural resources, in order to limit code size expansion. Our results for a VLIW architecture demonstrate that: (1) SMS delivers much higher throughput than that achieved by speculation and predicated execution; (2) the increase in performance due to adding speculation and predicated execution support on top of SMS averages around 12%, a minor gain that might not warrant the additional hardware complexity involved; and (3) the notion of virtual resources is very effective in reducing no-operations (NOPs) and consequently reduces code size with little or no impact on performance.

Relevance: 20.00%

Abstract:

This paper presents a low-ML-decoding-complexity, full-rate, full-diversity space-time block code (STBC) for a 2 transmit antenna, 2 receive antenna multiple-input multiple-output (MIMO) system, with coding gain equal to that of the best and well-known Golden code for any QAM constellation. Recently, two codes have been proposed (by Paredes, Gershman and Alkhansari, and by Sezginer and Sari) which enjoy lower decoding complexity than the Golden code but have a smaller coding gain. The 2 × 2 STBC presented in this paper has lower decoding complexity than the Golden code for non-square QAM constellations, while having the same decoding complexity for square QAM constellations. Compared with the Paredes-Gershman-Alkhansari and Sezginer-Sari codes, the proposed code has the same decoding complexity for non-rectangular QAM constellations. Simulation results comparing the codeword error rate (CER) performance are presented.
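A generic Monte Carlo harness for estimating the codeword error rate of a 2 × 2 STBC with exhaustive ML decoding over an i.i.d. Rayleigh channel is sketched below; the toy codebook, SNR, and trial count are placeholders, and the proposed code itself is not reproduced here.

```python
import numpy as np

def simulate_cer(codebook, snr_db, trials=2000, rng=None):
    """Monte Carlo codeword error rate of a 2x2 STBC with exhaustive ML decoding
    over an i.i.d. Rayleigh-fading 2x2 MIMO channel (generic harness, toy codebook)."""
    if rng is None:
        rng = np.random.default_rng(0)
    sigma = 10.0 ** (-snr_db / 20.0)
    errors = 0
    for _ in range(trials):
        i = rng.integers(len(codebook))
        H = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
        N = sigma * (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
        Y = H @ codebook[i] + N
        j = min(range(len(codebook)), key=lambda k: np.linalg.norm(Y - H @ codebook[k]))
        errors += int(j != i)
    return errors / trials

# Toy 4-codeword "codebook" of 2x2 unitary matrices (placeholder, not the proposed STBC).
toy = [np.eye(2), np.array([[0, 1], [-1, 0]]), 1j * np.eye(2), np.array([[0, 1j], [1j, 0]])]
print(simulate_cer([np.asarray(C, dtype=complex) for C in toy], snr_db=10))
```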

Relevance: 20.00%

Abstract:

We develop a Gaussian mixture model (GMM) based vector quantization (VQ) method for coding wideband speech line spectrum frequency (LSF) parameters at low complexity. The PDF of the LSF source vector is modeled as a Gaussian mixture (GM) density with a large number of uncorrelated mixture components, and an optimum scalar quantizer (SQ) is designed for each component. Quantization complexity is reduced by using only the relevant subset of the available optimum SQs; for an input vector, this subset is chosen using a nearest-neighbor criterion. The developed method is compared with recent VQ methods and shown to provide high-quality rate-distortion (R/D) performance at lower complexity. In addition, the method provides the advantages of bitrate scalability and rate-independent complexity.
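A minimal sketch of the selection-plus-scalar-quantization step is given below, assuming a pre-trained diagonal-covariance GMM and a uniform per-dimension scalar quantizer; the component count, step size, and the simple nearest-mean selection rule are illustrative rather than the paper's optimized design.

```python
import numpy as np

def quantize_with_gmm(x, means, stds, step=0.25):
    """Pick the mixture component whose mean is nearest to x, then scalar-quantize
    each dimension of the whitened residual with a uniform step (illustrative)."""
    k = int(np.argmin(np.linalg.norm(means - x, axis=1)))   # nearest-neighbor component selection
    z = (x - means[k]) / stds[k]                             # whiten with that component's std devs
    idx = np.round(z / step)                                 # per-dimension scalar quantization
    x_hat = means[k] + stds[k] * (idx * step)                # reconstruction
    return k, idx.astype(int), x_hat

# Toy usage: a 3-component, 4-dimensional "pre-trained" GMM with made-up parameters.
rng = np.random.default_rng(5)
means = rng.uniform(0.0, np.pi, size=(3, 4))   # LSF-like values lie in (0, pi)
stds = np.full((3, 4), 0.1)
x = means[1] + 0.05 * rng.normal(size=4)
k, idx, x_hat = quantize_with_gmm(x, means, stds)
print(k, np.abs(x - x_hat).max())              # selected component and max reconstruction error
```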