944 results for Complexity theory
Abstract:
Communication complexity refers to the minimum rate of public communication required for generating a maximal-rate secret key (SK) in the multiterminal source model of Csiszar and Narayan. Tyagi recently characterized this communication complexity for a two-terminal system. We extend the ideas in Tyagi's work to derive a lower bound on communication complexity in the general multiterminal setting. In the important special case of the complete graph pairwise independent network (PIN) model, our bound allows us to determine the exact linear communication complexity, i.e., the communication complexity when the communication and SK are restricted to be linear functions of the randomness available at the terminals.
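For orientation (our gloss in the notation standard for this literature, not quoted from the paper): for two terminals observing correlated sources X_1 and X_2, the maximal SK rate and Tyagi's communication complexity can be written as

    C_{SK} = I(X_1; X_2), \qquad R^{*} = CI_i(X_1; X_2) - I(X_1; X_2),

where CI_i denotes the interactive common information; in the multiterminal model of Csiszar and Narayan, the SK capacity is C_{SK} = H(X_V) - R_{CO}, with R_{CO} the minimum rate of communication for omniscience.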
Abstract:
We consider optimal power allocation policies for a single-server, multiuser system in which power is consumed only in the transmission of data and the transmission channel may experience multipath fading. We obtain efficient, low-complexity algorithms that minimize power while ensuring stability of the data queues. We also obtain policies for the case where users have mean delay constraints. If the power required is a linear function of the rate, we exploit this linearity to obtain low-complexity linear programs.
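To make the closing observation concrete, here is a minimal sketch (our illustration, not the authors' algorithm; all symbols and numbers are placeholders): when power is linear in rate, p = c(h) r in fading state h, a rate schedule that stabilizes a queue with mean arrival rate lambda can be found by a small linear program.

    # Hedged illustration: power-minimizing rate allocation across fading
    # states when power is linear in rate (p = c[h] * r). The probabilities
    # pi, costs c and arrival rate lam are hypothetical.
    import numpy as np
    from scipy.optimize import linprog

    pi = np.array([0.3, 0.5, 0.2])   # stationary fading-state probabilities
    c = np.array([2.0, 1.0, 0.5])    # power per unit rate in each state
    lam = 1.0                        # mean arrival rate to be stabilized

    # minimize average power sum_h pi[h]*c[h]*r[h]
    # subject to average service sum_h pi[h]*r[h] >= lam and r >= 0
    res = linprog(pi * c, A_ub=[-pi], b_ub=[-lam],
                  bounds=[(0, None)] * len(pi))
    print(res.x)  # all service is scheduled in the cheapest channel state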
Abstract:
We present a framework for obtaining reliable solid-state charge and optical excitations and spectra from optimally tuned range-separated hybrid density functional theory. The approach, which is fully couched within the formal framework of generalized Kohn-Sham theory, allows for the accurate prediction of exciton binding energies. We demonstrate the approach through first-principles calculations of one- and two-particle excitations in pentacene, a molecular semiconducting crystal, where our results are in excellent agreement with experiments and prior computations. We further show that with one adjustable parameter, set to reproduce the known band gap, the method accurately predicts band structures and optical spectra of silicon and lithium fluoride, prototypical covalent and ionic solids. Our findings indicate that for a broad range of extended bulk systems, this method may provide a computationally inexpensive alternative to many-body perturbation theory, opening the door to studies of materials of increasing size and complexity.
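As background (our summary of the standard optimal-tuning procedure, not the paper's wording): a range-separated hybrid splits the Coulomb operator with a range-separation parameter gamma,

    \frac{1}{r} = \frac{\operatorname{erf}(\gamma r)}{r} + \frac{\operatorname{erfc}(\gamma r)}{r},

and "optimal tuning" fixes gamma by enforcing the ionization-potential condition on the ground-state energies,

    \varepsilon_{\mathrm{HOMO}}(\gamma; N) = -\left[ E_{\mathrm{gs}}(\gamma; N-1) - E_{\mathrm{gs}}(\gamma; N) \right].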
Abstract:
In the POSSIBLE WINNER problem in computational social choice theory, we are given a set of partial preferences, and the question is whether a distinguished candidate can be made a winner by extending the partial preferences to linear preferences. Previous work has provided, for many common voting rules, fixed-parameter tractable algorithms for the POSSIBLE WINNER problem, with the number of candidates as the parameter. However, the corresponding kernelization question is still open and has, in fact, been mentioned as a key research challenge [10]. In this paper, we settle this open question for many common voting rules. We show that the POSSIBLE WINNER problem for maximin, Copeland, Bucklin, ranked pairs, and a class of scoring rules that includes the Borda voting rule does not admit a polynomial kernel with the number of candidates as the parameter. We show, however, that the COALITIONAL MANIPULATION problem, an important special case of the POSSIBLE WINNER problem, does admit a polynomial kernel for maximin, Copeland, ranked pairs, and a class of scoring rules that includes the Borda voting rule, when the number of manipulators is polynomial in the number of candidates. A significant conclusion of our work is that the POSSIBLE WINNER problem is strictly harder than the COALITIONAL MANIPULATION problem: the latter admits a polynomial kernel whereas the former does not.
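To fix the problem definition (an illustrative brute-force check under the Borda rule, exponential in the number of candidates and usable only for tiny instances; the code and names are ours, not the paper's):

    # Hedged sketch: brute-force POSSIBLE WINNER under the Borda rule.
    # A partial preference is a set of (a, b) pairs meaning "a beats b".
    from itertools import permutations, product

    def extends(order, partial):
        pos = {c: i for i, c in enumerate(order)}
        return all(pos[a] < pos[b] for a, b in partial)

    def possible_winner(candidates, partial_profiles, distinguished):
        m = len(candidates)
        extensions = [[o for o in permutations(candidates) if extends(o, p)]
                      for p in partial_profiles]
        for orders in product(*extensions):   # one linear order per voter
            score = {c: 0 for c in candidates}
            for order in orders:
                for i, c in enumerate(order):
                    score[c] += m - 1 - i     # Borda points
            if all(score[distinguished] >= score[c] for c in candidates):
                return True                   # co-winner in some extension
        return False

    # Two voters with partial preferences; can 'c' still win?
    print(possible_winner(['a', 'b', 'c'], [{('c', 'a')}, {('c', 'b')}], 'c'))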
Abstract:
In recent years coastal resource management has begun to stand as its own discipline. Its multidisciplinary nature gives it access to theory situated in each of the diverse fields it may encompass, yet management practices often revert to the primary field of the manager: there is no common body of "coastal" theory from which managers can draw. Coastal area managers must contend with seven resource-related issues: coastal habitat conservation, traditional maritime communities and economies, strong development and use pressures, adaptation to sea level rise and climate change, landscape sustainability and resilience, coastal hazards, and emerging energy technologies. The complexity and range of human and environmental interactions at the coast suggest a strong need for a common body of coastal management theory that managers would do well to understand generally. Planning theory, itself a synthesis of concepts from multiple fields, contains ideas of general value to coastal management. Planning theory can not only provide an example of how to develop a multi- or transdisciplinary body of theory, but may also provide an actual theoretical foundation for a coastal theory. In particular we discuss five concepts in the planning theory discourse and present their utility for coastal resource managers: "wicked" problems, ecological planning, the epistemology of knowledge communities, the role of the planner/manager, and collaborative planning. While these theories are known and familiar to some professionals working at the coast, we argue that there is a need for broader understanding amongst the various specialists working in the increasingly identifiable field of coastal resource management.
Abstract:
Signal processing techniques play important roles in the design of digital communication systems, including information manipulation, transmitter signal processing, channel estimation, channel equalization and receiver signal processing. By drawing on communication theory and system implementation technologies, signal processing specialists develop efficient schemes for various communication problems by wisely exploiting mathematical tools such as analysis, probability theory, matrix theory, and optimization theory. In recent years, researchers realized that multiple-input multiple-output (MIMO) channel models are applicable to a wide range of different physical communication channels. Using elegant matrix-vector notation, many MIMO transceiver design problems (covering both the precoder and the equalizer) can be solved by matrix and optimization theory. Furthermore, researchers have shown that majorization theory and matrix decompositions, such as the singular value decomposition (SVD), the geometric mean decomposition (GMD) and the generalized triangular decomposition (GTD), provide unified frameworks for solving many point-to-point MIMO transceiver design problems.
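As a concrete instance of the matrix-decomposition viewpoint (a generic SVD illustration, not one of the new decompositions developed in this thesis): with CSI at both ends, the SVD H = U Sigma V^H turns a flat MIMO channel into parallel scalar subchannels when V is used as the precoder and U^H as the equalizer.

    # Hedged illustration: SVD-based transceiver diagonalizes a MIMO channel.
    # The random channel and symbols are placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    H = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    U, s, Vh = np.linalg.svd(H)

    x = rng.standard_normal(4)        # data symbols
    y = H @ (Vh.conj().T @ x)         # precode with V, send through channel
    x_hat = U.conj().T @ y            # equalize with U^H
    print(np.allclose(x_hat, s * x))  # parallel subchannels with gains s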
In this thesis, we consider transceiver design problems for linear time-invariant (LTI) flat MIMO channels, linear time-varying narrowband MIMO channels, flat MIMO broadcast channels, and doubly selective scalar channels. The channel estimation problem is also considered. The main contributions of this dissertation are the development of new matrix decompositions and the use of these decompositions, together with majorization theory, in practical transmit-receive scheme designs for transceiver optimization problems. Elegant solutions are obtained, novel transceiver structures are developed, ingenious algorithms are proposed, and performance analyses are derived.
The first part of the thesis focuses on transceiver design for LTI flat MIMO channels. We propose a novel matrix decomposition which decomposes a complex matrix as a product of several sets of semi-unitary matrices and upper triangular matrices in an iterative manner. The complexity of the new decomposition, the generalized geometric mean decomposition (GGMD), is always less than or equal to that of the geometric mean decomposition (GMD). The optimal GGMD parameters which yield the minimal complexity are derived. Based on the channel state information (CSI) at both the transmitter (CSIT) and the receiver (CSIR), the GGMD is used to design a butterfly-structured decision feedback equalizer (DFE) MIMO transceiver which achieves the minimum average mean square error (MSE) under the total transmit power constraint. A novel iterative detection algorithm for this receiver is also proposed. For the application to cyclic prefix (CP) systems, in which the SVD of the equivalent channel matrix can be computed easily, the proposed GGMD transceiver has a K/log_2(K)-fold complexity advantage over the GMD transceiver, where K is the number of data symbols per data block and is a power of 2. The performance analysis shows that the GGMD DFE transceiver converts a MIMO channel into a set of parallel subchannels with the same bias and signal-to-interference-plus-noise ratios (SINRs). Hence, the average bit error rate (BER) is automatically minimized without the need for bit allocation. Moreover, the proposed transceiver can achieve the channel capacity simply by applying independent scalar Gaussian codes of the same rate on the subchannels.
In the second part of the thesis, we focus on MIMO transceiver design for slowly time-varying MIMO channels under the zero-forcing or MMSE criterion. Even though the GGMD/GMD DFE transceivers work for slowly time-varying MIMO channels by exploiting the instantaneous CSI at both ends, their performance is by no means optimal, since the temporal diversity of the time-varying channels is not exploited. Based on the GTD, we develop the space-time GTD (ST-GTD) for the decomposition of linear time-varying flat MIMO channels. Under the assumption that CSIT, CSIR and channel prediction are available, we use the proposed ST-GTD to develop space-time geometric mean decomposition (ST-GMD) DFE transceivers under the zero-forcing or MMSE criterion. Under perfect channel prediction, the new system minimizes both the average MSE at the detector in each space-time (ST) block (which consists of several coherence blocks) and the average per-ST-block BER in the moderately high SNR region. Moreover, the ST-GMD DFE transceiver designed under the MMSE criterion maximizes the Gaussian mutual information over the equivalent channel seen by each ST block. In general, the newly proposed transceivers perform better than the GGMD-based systems, since the superimposed temporal precoder is able to exploit the temporal diversity of time-varying channels. For practical applications, we also propose a novel ST-GTD based system which does not require channel prediction but shares the same asymptotic BER performance as the ST-GMD DFE transceiver.
The third part of the thesis considers two quality-of-service (QoS) transceiver design problems for flat MIMO broadcast channels. The first is the power minimization problem (min-power) with a total bitrate constraint and per-stream BER constraints. The second is the rate maximization problem (max-rate) with a total transmit power constraint and per-stream BER constraints. Exploiting a particular class of joint triangularization (JT), we are able to jointly optimize the bit allocation and the broadcast DFE transceiver for the min-power and max-rate problems. The resulting optimal designs are called the minimum power JT broadcast DFE transceiver (MPJT) and the maximum rate JT broadcast DFE transceiver (MRJT), respectively. In addition to the optimal designs, we propose two suboptimal designs based on the QR decomposition; they are realizable for an arbitrary number of users.
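To illustrate the bit-allocation side of such QoS designs (a textbook greedy bit-loading sketch under our own placeholder parameters, not the JT-based optimization of the thesis): bits are assigned one at a time to the subchannel that currently needs the least incremental power to carry one more bit at the target BER.

    # Hedged sketch: greedy (Hughes-Hartogs style) bit loading.
    # gains, gap and total_bits are illustrative placeholders.
    import numpy as np

    def bit_loading(gains, total_bits, gap=1.0):
        bits = np.zeros(len(gains), dtype=int)
        power = np.zeros(len(gains))
        for _ in range(total_bits):
            # extra power needed for one more bit on each subchannel,
            # using the gap approximation P(b) = gap * (2**b - 1) / gain
            inc = gap * (2.0 ** (bits + 1) - 2.0 ** bits) / gains
            k = int(np.argmin(inc))
            bits[k] += 1
            power[k] += inc[k]
        return bits, power.sum()

    print(bit_loading(np.array([4.0, 2.0, 1.0]), total_bits=6))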
Finally, we investigate the design of a discrete Fourier transform (DFT) modulated filterbank transceiver (DFT-FBT) for linear time-varying (LTV) scalar channels. For both the case of known LTV channels and that of unknown wide-sense stationary uncorrelated scattering (WSSUS) statistical channels, we show how to optimize the transmitting and receiving prototypes of a DFT-FBT such that the SINR at the receiver is maximized. We also propose a novel pilot-aided subspace channel estimation algorithm for orthogonal frequency division multiplexing (OFDM) systems with quasi-stationary multipath Rayleigh fading channels. Using the concept of a difference co-array, the new technique can construct M^2 co-pilots from M physical pilot tones with alternating pilot placement. Subspace methods, such as MUSIC and ESPRIT, can be used to estimate the multipath delays, and the number of identifiable paths is theoretically up to O(M^2). With the delay information, an MMSE estimator for the frequency response is derived. Simulations show that the proposed method outperforms the conventional subspace channel estimator when the number of multipaths is greater than or equal to the number of physical pilots minus one.
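For intuition about the subspace step (a generic MUSIC-style delay estimator on uniformly spaced pilots, entirely our toy construction; the difference co-array and alternating pilot placement of the proposed scheme are not reproduced): the channel frequency response sampled at pilot tones is a sum of complex exponentials in the path delays, so a noise-subspace search recovers them.

    # Hedged sketch: MUSIC multipath-delay estimation from pilot tones.
    import numpy as np

    N, M, L = 64, 16, 8                    # FFT size, pilots, window length
    true_delays = np.array([3.0, 7.5])     # in samples (placeholders)
    k = np.arange(M)
    h = sum(np.exp(-2j * np.pi * k * d / N) for d in true_delays)
    h += 0.01 * (np.random.randn(M) + 1j * np.random.randn(M))

    # forward smoothing: stack length-L windows as snapshots
    X = np.stack([h[i:i + L] for i in range(M - L + 1)], axis=1)
    R = X @ X.conj().T / X.shape[1]
    w, V = np.linalg.eigh(R)
    En = V[:, :-len(true_delays)]          # noise subspace (small eigenvalues)

    grid = np.linspace(0, 20, 2001)        # candidate delays
    A = np.exp(-2j * np.pi * np.outer(np.arange(L), grid) / N)
    spec = 1.0 / np.sum(np.abs(En.conj().T @ A) ** 2, axis=0)
    peaks = [i for i in range(1, len(grid) - 1)
             if spec[i] > spec[i - 1] and spec[i] > spec[i + 1]]
    best = sorted(peaks, key=lambda i: spec[i])[-2:]
    print(sorted(grid[i] for i in best))   # approx. the true delays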
Abstract:
This report is an introduction to the concept of treewidth, a property of graphs that has important implications for algorithms. Some basic concepts of graph theory are presented in the first chapter for those readers who are not familiar with the notation. In Chapter 2, the definition of treewidth and several different ways of characterizing it are explained. The last two chapters focus on the algorithmic implications of treewidth, which are very relevant in computer science. An algorithm to compute the treewidth of a graph is presented, and its result can later be applied to many other problems in graph theory, such as those introduced in the last chapter.
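As a quick taste of treewidth in practice (using NetworkX's min-degree heuristic, which only upper-bounds the true treewidth; the library choice is ours, not the report's exact algorithm):

    # Hedged sketch: upper-bounding treewidth with a standard heuristic.
    import networkx as nx
    from networkx.algorithms.approximation import treewidth_min_degree

    G = nx.cycle_graph(6)              # a cycle has treewidth 2
    width, decomp = treewidth_min_degree(G)
    print(width)                       # 2 for this graph
    print(list(decomp.nodes))          # bags of the tree decomposition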
Abstract:
This paper proposes a method for analysing the operational complexity in supply chains using an entropic measure based on information theory. The proposed approach estimates the operational complexity at each stage of the supply chain and analyses the changes between stages; here a stage is identified by an exchange of data and/or material. Through this analysis the method identifies the stages where operational complexity is generated and how it propagates between them (exported, imported, generated or absorbed). Central to the method is the identification of a reference point within the supply chain, where the operational complexity is at a local minimum along the data transfer stages. Such a point can be thought of as a 'sink' for turbulence generated in the supply chain. Where it exists, it has the merit of stabilising the supply chain by attenuating uncertainty. However, the location of the reference point is also a matter of choice: if the preferred location is other than the current one, this is a trigger for management action, and the analysis can help decide appropriate remedial action. More generally, the approach can assist logistics management by highlighting problem areas. An industrial application is presented to demonstrate the applicability of the method.
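To make the entropic measure concrete (a minimal sketch with placeholder states and probabilities, not the paper's data or exact estimator): the operational complexity of a stage can be taken as the Shannon entropy of its observed state distribution, and stage-to-stage differences show where complexity is generated (positive) or absorbed (negative).

    # Hedged sketch: entropy per stage and its change between stages.
    import numpy as np

    def entropy(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0] / p.sum()
        return -(p * np.log2(p)).sum()

    stages = {                         # observed state probabilities
        "supplier":  [0.7, 0.2, 0.1],
        "plant":     [0.4, 0.3, 0.3],
        "warehouse": [0.9, 0.05, 0.05],
    }
    H = {name: entropy(p) for name, p in stages.items()}
    names = list(stages)
    for a, b in zip(names, names[1:]):
        print(f"{a} -> {b}: {H[b] - H[a]:+.3f} bits")
    # the stage with minimal entropy acts as the 'reference point' (sink)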
Abstract:
Modeling work in neuroscience can be classified using two different criteria. The first one is the complexity of the model, ranging from simplified conceptual models that are amenable to mathematical analysis to detailed models that require simulations in order to understand their properties. The second criterion is that of direction of workflow, which can be from microscopic to macroscopic scales (bottom-up) or from behavioral target functions to properties of components (top-down). We review the interaction of theory and simulation using examples of top-down and bottom-up studies and point to some current developments in the fields of computational and theoretical neuroscience.
Abstract:
Kurki, M. (2006). Causes of a Divided Discipline: Rethinking the Concept of Cause in International Relations Theory. Review of International Studies, 32(2), 189-216.
Abstract:
This Thesis is an exploration of potential enhancement in effectiveness, personally, professionally and organisationally, through the use of Theory as an Apparatus of Thought. Enhanced effectiveness was sought by the practitioner (the Subject) while in transition to becoming Chief Executive of his organisation. The introduction outlines the content and the structure of the University College Cork DBA. Essay One outlines what Theory is and what Adult Mental Development is, and explores the Theories held in the Author's past professional practice; immunity to change is also reflected on. Essay Two looks at the construction of the key Theories used in the Thesis. Prof. Robert Kegan's Theory of Adult Mental Development was used to aid the generation of insight; the other key Theories used were the Theory of the Business, the Theory of the Co-operative and a Theory of Organisational Leadership. Essay Three explores the application of the key Theories in a professional setting. The findings of the Thesis were that the Subject was capable of dealing with increased environmental complexity and uncertainty by using Theory as an Apparatus of Thought, which in turn enhanced personal, professional and organisational effectiveness. This was achieved by becoming more aware of the Theories held by the practitioner and of the experiences from applying those Theories, which then led to greater insight. The author also found that a detailed understanding of the Theory of the Business and of a Theory of Leadership would support any new CEO in the challenging early part of their tenure.
Abstract:
Whether a small cell, a small genome or a minimal set of chemical reactions with self-replicating properties, simplicity is beguiling. As Leonardo da Vinci reportedly said, 'simplicity is the ultimate sophistication'. Two diverging views of simplicity have emerged in accounts of symbiotic and commensal bacteria and cosmopolitan free-living bacteria with small genomes. The small genomes of obligate insect endosymbionts have been attributed to genetic drift caused by small effective population sizes (Ne). In contrast, streamlining theory attributes small cells and genomes to selection for efficient use of nutrients in populations where Ne is large and nutrients limit growth. Regardless of the cause of genome reduction, lost coding potential eventually dictates loss of function. Consequences of reductive evolution in streamlined organisms include atypical patterns of prototrophy and the absence of common regulatory systems, which have been linked to difficulty in culturing these cells. Recent evidence from metagenomics suggests that streamlining is commonplace, may broadly explain the phenomenon of the uncultured microbial majority, and might also explain the highly interdependent (connected) behavior of many microbial ecosystems. Streamlining theory is belied by the observation that many successful bacteria are large cells with complex genomes. To fully appreciate streamlining, we must look to the life histories and adaptive strategies of cells, which impose minimum requirements for complexity that vary with niche.