3 results for 080401 Coding and Information Theory
at Duke University
Abstract:
Backscatter communication is an emerging wireless technology that has recently gained increasing attention from both academia and industry. The key innovation of the technology is the ability of ultra-low-power devices to communicate by modulating nearby existing radio signals. Because the devices need not generate a radio signal of their own, they can be simple in design, very inexpensive, and extremely energy efficient compared with traditional wireless communication. These benefits have made backscatter communication a desirable candidate for energy-constrained distributed wireless sensor network applications.
The backscatter channel presents a unique set of challenges. Unlike a conventional one-way communication channel (in which the information source is also the energy source), the backscatter channel suffers from strong self-interference and spread-Doppler clutter that mask the information-bearing (modulated) signal scattered from the device. Both sources of interference arise from the scattering of the transmitted signal off objects, both stationary and moving, in the environment. Additionally, measurement of the backscatter device's location is degraded by both the clutter and the modulation of the signal return.
This work proposes a channel coding framework for the backscatter channel consisting of a bistatic transmitter/receiver pair and a quasi-cooperative transponder. It proposes run-length-limited (RLL) coding to mitigate the background self-interference and spread-Doppler clutter with only a small decrease in communication rate. The proposed method applies to both binary phase-shift keying (BPSK) and quadrature amplitude modulation (QAM) schemes and increases the rate by up to a factor of two compared with previous methods.
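The abstract does not spell out the RLL construction used in the thesis. As a minimal illustrative sketch only, the Python fragment below enforces a maximum-run constraint by bit-stuffing and maps the coded bits to BPSK symbols; the helper names and the run limit k = 3 are assumptions, not the thesis's code.

```python
import numpy as np

def stuff_max_run(bits, k=3):
    # Toy maximum-run-length constraint via bit-stuffing: after k
    # identical bits, insert the complement to force a transition.
    # (Hypothetical helper; practical (d, k)-RLL codes use fixed-rate
    # lookup tables, and the decoder must strip the stuffed bits.)
    out, prev, run = [], None, 0
    for b in bits:
        run = run + 1 if b == prev else 1
        prev = int(b)
        out.append(prev)
        if run == k:
            prev = 1 - prev          # stuffed transition bit
            out.append(prev)
            run = 1
    return np.array(out)

def bpsk(bits):
    # Map bits {0, 1} to antipodal phases {+1, -1}.
    return 1.0 - 2.0 * bits

rng = np.random.default_rng(0)
message = rng.integers(0, 2, 64)
symbols = bpsk(stuff_max_run(message, k=3))
```

Forcing a phase transition at least every k symbols pushes the modulated signal's spectral energy away from DC, where static self-interference and slowly moving clutter concentrate, at the cost of a small rate loss from the stuffed bits.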
Additionally, this work analyzes the use of frequency modulation and bi-phase waveform coding for the transmitted (interrogating) waveform for high-precision range estimation of the transponder location. Compared with previous methods, optimally low range sidelobes are achieved. Moreover, since both the transmitted (interrogating) waveform coding and the transponder communication coding result in instantaneous phase modulation of the signal, cross-interference exists between the localization and communication tasks. A phase-discriminating algorithm is proposed that separates the waveform coding from the communication coding upon reception and achieves localization with up to 3 dB more signal energy than previously reported results.
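The abstract does not detail the phase-discriminating algorithm itself; the sketch below only illustrates the structure such separation can exploit: when both codes are bi-phase, the received phase is the product of the known interrogating code and the unknown message, so a matched filter against the known code recovers the message, and the message-compensated return is then available for ranging. The 13-chip Barker code and all names here are illustrative assumptions.

```python
import numpy as np

# Known bi-phase interrogating waveform (a 13-chip Barker code, chosen
# here only for illustration) and an unknown BPSK message that the
# transponder impresses on top of it.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], float)
rng = np.random.default_rng(1)
msg = 1.0 - 2.0 * rng.integers(0, 2, 8)       # +/-1 message symbols

# One coded pulse per message symbol: both codes are bi-phase, so the
# total instantaneous phase is the product of the two codes.
tx = np.concatenate([m * barker13 for m in msg])
rx = tx + 0.1 * rng.standard_normal(tx.size)  # additive receiver noise

# De-spread each pulse against the known waveform code (matched
# filter). The sign of each correlation is the communication bit;
# re-multiplying rx by the decided bits would leave a message-free
# Barker return for range (sidelobe-level) processing.
pulses = rx.reshape(len(msg), barker13.size)
decisions = np.sign(pulses @ barker13)
print(np.array_equal(decisions, msg))          # True
```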
The joint communication-localization framework also enables a low-complexity receiver design, because the same radio is used for both localization and communication.
Simulations comparing the performance of different codes corroborate the theoretical results and illustrate a possible trade-off between information rate and clutter mitigation, as well as a trade-off among choices of waveform-channel coding pairs. Experimental results from a brassboard microwave system in an indoor environment are also presented and discussed.
Resumo:
There are many sociopolitical theories that help explain why governments and actors do what they do. Securitization Theory is a process-oriented theory in international relations that focuses on how an actor defines another actor as an “existential threat,” and on the resulting responses that can be taken to address that threat. While Securitization Theory is an acceptable method for analyzing the relationships between actors in the international system, this thesis contends that the proper examination is multi-factorial, and adds Role Theory to the analysis. Role Theory, another international relations theory, explains how an actor’s strategies, relationships, and perception by others are based on pre-conceptualized definitions of that actor’s identity; its consideration is essential to fully explain why an actor might respond to another in a particular way. Certain roles an actor may enact produce a rival relationship with other actors in the system, and it is those rival roles that elicit securitized responses. The possibility of a securitized response lessens when a role, or a relationship between roles, becomes ambiguous. There are clear points of role rivalry and role ambiguity between Hizb’allah and Iran, which has directly impacted, and continues to impact, how the United States (US) responds to these actors. Because of role ambiguity, the US has still not conceptualized an effective way to deal with Hizb’allah and Iran holistically across all their various areas of operation and in their various enacted roles. It would be overly simplistic to view Hizb’allah and Iran solely through one lens depending on which hemisphere or continent one is observing; the reality is likely more nuanced, and both Role Theory and Securitization Theory can help to understand and articulate those nuances. By examining two case studies of Hizb’allah and Iran’s enactment of various roles in the Middle East and Latin America, it becomes clear where roles cause a securitized response and where the response is less securitized due to role ambiguity. This augmented approach of combining both theories, along with supplementing the manner in which an actor, action, or role is analyzed, will produce better methods for policy-making that can address the more ambiguous activities of Hizb’allah and Iran in these two regions.
Resumo:
Uncertainty quantification (UQ) is both an old and a new concept. The current novelty lies in the interaction and synthesis of mathematical models, computer experiments, statistics, field/real experiments, and probability theory, with a particular emphasis on large-scale simulations by computer models. The challenges come not only from the complexity of the scientific questions but also from the sheer size of the data. The focus of this thesis is to provide statistical models that scale to the massive data produced in computer experiments and real experiments, through fast and robust statistical inference.
Chapter 2 provides a practical approach for simultaneously emulating/approximating a massive number of functions, with an application to hazard quantification of the Soufrière Hills volcano on the island of Montserrat. Chapter 3 addresses another massive-data problem, in which the number of observations of a function is large: an exact algorithm that is linear in time is developed for the interpolation of methylation levels. Chapters 4 and 5 both concern robust inference for these models. Chapter 4 provides a new robustness criterion for parameter estimation, and several inference methods are shown to satisfy it. Chapter 5 develops a new prior that satisfies additional criteria and is therefore proposed for use in practice.
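The abstract does not name the algorithm behind the linear-time interpolation. One standard route to exact fast inference, sketched below as an assumption rather than the thesis's method, is a Gaussian process with an exponential (Ornstein-Uhlenbeck) kernel: its Markov property makes each prediction depend only on the two neighbouring observations, so no large linear system is ever solved. The helper name `ou_interpolate`, the noiseless-observation setting, and the length-scale `ell` are all illustrative.

```python
import numpy as np

def ou_interpolate(x, y, xq, ell=1.0):
    # Exact GP interpolation under a zero-mean process with kernel
    # k(s, t) = exp(-|s - t| / ell), observed without noise at sorted
    # inputs x. The Markov property of this kernel means each query
    # (assumed to lie inside [x[0], x[-1]]) depends only on its two
    # neighbouring observations.
    idx = np.clip(np.searchsorted(x, xq), 1, len(x) - 1)
    xl, yl = x[idx - 1], y[idx - 1]
    xr, yr = x[idx], y[idx]
    a = np.exp(-(xq - xl) / ell)   # corr(query, left neighbour)
    b = np.exp(-(xr - xq) / ell)   # corr(query, right neighbour)
    # 2x2 GP conditioning, simplified using corr(left, right) = a * b
    return (a * (1 - b**2) * yl + b * (1 - a**2) * yr) / (1 - (a * b)**2)

x = np.linspace(0.0, 10.0, 100_001)            # sorted observation sites
y = np.sin(x)
print(ou_interpolate(x, y, np.array([2.5, 7.25])))
```

After sorting, each query costs a neighbour lookup plus O(1) arithmetic, so a full pass is linear in the number of observations up to the lookup; a Kalman-smoother formulation of the same model also handles observation noise in O(n).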