845 results for "Middleware for wireless communication"
Abstract:
Wireless-communication technology can be used to improve road safety and to provide Internet access inside vehicles. This paper proposes a cross-layer protocol called coordinated external peer communication (CEPEC) for Internet-access services and peer communications in vehicular networks. We assume that IEEE 802.16 base stations (BSs) are installed along highways and that vehicles are equipped with the same air interface. Vehicles located outside the limited coverage of their nearest BS can still access the Internet via a multihop route to that BS. For Internet-access services, the objective of CEPEC is to increase end-to-end throughput while guaranteeing fairness in bandwidth usage among road segments. To achieve this goal, the road is logically partitioned into segments of equal length, and a relaying head is selected in each segment to perform both local-packet collection and aggregated-packet relaying. Simulation results show that the proposed CEPEC protocol provides higher throughput with guaranteed fairness in multihop data delivery in vehicular networks compared with a purely IEEE 802.16-based protocol.
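To make the segment-and-relay idea concrete, the sketch below partitions vehicle positions into equal-length road segments and elects one relaying head per segment. It is only an illustration of the mechanism named in the abstract; the segment length, base-station position and head-selection rule (vehicle nearest the BS within its segment) are assumptions, not CEPEC's actual specification.

```python
# Illustrative sketch only: partition a highway into equal-length segments and
# elect one relaying head per segment, as CEPEC's description suggests.
# SEGMENT_LEN, BS_POSITION and the head-selection rule are assumed values.

SEGMENT_LEN = 500.0   # metres per logical road segment (assumed)
BS_POSITION = 0.0     # base station at the start of the road (assumed)

def assign_segments(vehicle_positions, segment_len=SEGMENT_LEN):
    """Map each vehicle position (metres along the road) to a segment index."""
    segments = {}
    for vid, pos in vehicle_positions.items():
        segments.setdefault(int(pos // segment_len), []).append(vid)
    return segments

def elect_relay_heads(vehicle_positions, segments, bs_pos=BS_POSITION):
    """In each segment, choose the vehicle closest to the BS as relaying head;
    the head collects local packets and relays the aggregate toward the BS."""
    return {seg: min(vids, key=lambda v: abs(vehicle_positions[v] - bs_pos))
            for seg, vids in segments.items()}

if __name__ == "__main__":
    positions = {"v1": 120.0, "v2": 480.0, "v3": 730.0, "v4": 1490.0}
    print(elect_relay_heads(positions, assign_segments(positions)))
    # -> {0: 'v1', 1: 'v3', 2: 'v4'}
```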
Abstract:
Link adaptation is a critical component of IEEE 802.11 systems. In this paper, we analytically model a retransmission-based Auto Rate Fallback (ARF) link adaptation algorithm. Both packet collisions and packet corruptions are captured by the model. The model provides insight into the dynamics of link adaptation algorithms and the configuration of their parameters. It is also observed that when the number of competing stations is high, packet collisions can largely dictate the performance of ARF and force it to operate at the lowest data rate, even when no packet corruption occurs. This contradicts the existing assumption that packet collisions do not affect the correct operation of ARF and can be ignored in its evaluation. The work presented in this paper can provide guidelines on configuring link adaptation algorithms and designing new ones for future high-speed 802.11 systems. © 2006 IEEE.
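The sketch below shows a minimal retransmission-based, ARF-style controller of the kind the paper models: consecutive ACKs push the rate up, consecutive failures push it down. The 802.11b rate set and the 10-success/2-failure thresholds are common defaults assumed here for illustration. Note that a missing ACK caused by a collision is indistinguishable from one caused by corruption at this layer, which is exactly why heavy contention can drag the controller to the lowest rate.

```python
# ARF-style rate controller (sketch). The rate set and thresholds are assumed.

RATES_MBPS = [1, 2, 5.5, 11]

class ArfController:
    def __init__(self, up_threshold=10, down_threshold=2):
        self.idx = 0                 # start at the lowest rate
        self.successes = 0
        self.failures = 0
        self.up = up_threshold
        self.down = down_threshold

    def on_tx_result(self, acked):
        """Update the rate after each transmission attempt; return the new rate.
        A missing ACK may be a corruption OR a collision -- the controller
        cannot tell them apart, which is the effect the paper analyses."""
        if acked:
            self.successes += 1
            self.failures = 0
            if self.successes >= self.up and self.idx < len(RATES_MBPS) - 1:
                self.idx += 1        # probe the next higher rate
                self.successes = 0
        else:
            self.failures += 1
            self.successes = 0
            if self.failures >= self.down and self.idx > 0:
                self.idx -= 1        # fall back to a more robust rate
                self.failures = 0
        return RATES_MBPS[self.idx]
```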
Abstract:
In recent years, wireless communication infrastructures have been widely deployed for both personal and business applications. IEEE 802.11 Wireless Local Area Network (WLAN) standards attract a lot of attention due to their low cost and high data rates, and wireless ad hoc networks that use IEEE 802.11 are an active area of network research. Designing appropriate Media Access Control (MAC) layer protocols is one of the key issues for wireless ad hoc networks.

Existing wireless applications typically use omni-directional antennas, whose gain is the same in all directions. Due to the nature of the Distributed Coordination Function (DCF) mechanism of IEEE 802.11, only one of the one-hop neighbors can send data at a time; nodes other than the sender and the receiver must remain idle or listening, otherwise collisions could occur. The downside of omni-directionality is that the spatial reuse ratio is low and the capacity of the network is considerably limited.

Directional antennas have therefore been introduced to improve spatial reuse. A directional antenna offers the following benefits: it can improve transport capacity by decreasing the interference of a directional main lobe; it can increase coverage range due to a higher SINR (Signal to Interference plus Noise Ratio), i.e., better connectivity can be achieved for the same power consumption; and power usage can be reduced, i.e., for the same coverage, a transmitter can lower its transmit power.

To exploit the advantages of directional antennas, we propose a relay-enabled MAC protocol. Two relay nodes are chosen to forward data when the channel condition of the direct link from the sender to the receiver is poor. The two relay nodes can transfer data at the same time, and pipelined data transmission can be achieved by using directional antennas. Throughput is improved significantly by the relay-enabled MAC protocol.

Besides these strong points, directional antennas also have some clear drawbacks, such as the hidden-terminal and deafness problems and the need to maintain location information for each node. Therefore, an omni-directional antenna should still be used in some situations. The combined use of omni-directional and directional antennas leads to the problem of configuring heterogeneous antennas: given a network topology and a traffic pattern, we need to find a trade-off between using omni-directional and directional antennas to obtain better network performance.

Directly and mathematically establishing the relationship between network performance and antenna configuration is extremely difficult, if not intractable. Therefore, in this research we propose several clustering-based methods that obtain approximate solutions to the heterogeneous antenna configuration problem and can improve network performance significantly.

Our proposed methods consist of two steps. The first step (clustering links) groups the links based on the matrix-based system model; after clustering, links in the same group have similar neighborhood nodes and will use the same type of antenna. The second step (labeling links) decides the type of antenna for each group: some groups of links will use directional antennas and others will adopt omni-directional antennas (a minimal sketch of this two-step idea is given below). Experiments are conducted to compare the proposed methods with existing methods, and the results demonstrate that our clustering-based methods can improve network performance significantly.
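A minimal sketch of the two-step procedure (cluster links by neighborhood similarity, then label each cluster with an antenna type). The Jaccard similarity measure, the greedy grouping and the "dense neighborhood, use directional" labeling rule are stand-in assumptions for illustration; the dissertation's matrix-based model is more elaborate.

```python
# Sketch of the two-step configuration idea. The Jaccard similarity, the greedy
# grouping and the degree-based labelling rule are assumptions for illustration.

def jaccard(a, b):
    """Similarity of two neighbour sets."""
    return len(a & b) / len(a | b) if (a | b) else 1.0

def cluster_links(link_neighbours, threshold=0.5):
    """Step 1: greedily group links whose neighbour sets are similar."""
    clusters = []
    for link, nbrs in link_neighbours.items():
        for cluster in clusters:
            if jaccard(nbrs, link_neighbours[cluster[0]]) >= threshold:
                cluster.append(link)
                break
        else:
            clusters.append([link])
    return clusters

def label_clusters(clusters, link_neighbours, degree_cutoff=2):
    """Step 2: assign an antenna type per cluster; crowded neighbourhoods
    get directional antennas (assumed heuristic), the rest stay omni."""
    labels = {}
    for cluster in clusters:
        avg_deg = sum(len(link_neighbours[l]) for l in cluster) / len(cluster)
        labels[tuple(cluster)] = "directional" if avg_deg >= degree_cutoff else "omni"
    return labels

if __name__ == "__main__":
    neighbours = {"l1": {"a", "b", "c"}, "l2": {"a", "b"}, "l3": {"x"}}
    groups = cluster_links(neighbours)
    print(label_clusters(groups, neighbours))
    # -> {('l1', 'l2'): 'directional', ('l3',): 'omni'}
```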
Abstract:
Backscatter communication is an emerging wireless technology that has recently gained increasing attention from both academia and industry. The key innovation of the technology is the ability of ultra-low-power devices to utilize nearby existing radio signals to communicate. Because there is no need to generate their own energetic radio signal, the devices can benefit from a simple design, are very inexpensive and are extremely energy efficient compared with traditional wireless communication. These benefits have made backscatter communication a desirable candidate for distributed wireless sensor network applications with energy constraints.
The backscatter channel presents a unique set of challenges. Unlike a conventional one-way communication (in which the information source is also the energy source), the backscatter channel experiences strong self-interference and spread Doppler clutter that mask the information-bearing (modulated) signal scattered from the device. Both of these sources of interference arise from the scattering of the transmitted signal off of objects, both stationary and moving, in the environment. Additionally, the measurement of the location of the backscatter device is negatively affected by both the clutter and the modulation of the signal return.
This work proposes a channel coding framework for the backscatter channel consisting of a bi-static transmitter/receiver pair and a quasi-cooperative transponder. It proposes to use run-length limited (RLL) coding to mitigate the background self-interference and spread-Doppler clutter with only a small decrease in communication rate (a toy illustration of the run-length-limiting idea is sketched after this abstract). The proposed method applies to both binary phase-shift keying (BPSK) and quadrature-amplitude modulation (QAM) schemes and provides an increase in rate by up to a factor of two compared with previous methods.
Additionally, this work analyzes the use of frequency modulation and bi-phase waveform coding for the transmitted (interrogating) waveform for high-precision range estimation of the transponder location. Compared with previous methods, optimally low range sidelobes are achieved. Moreover, since both the transmitted (interrogating) waveform coding and the transponder communication coding result in instantaneous phase modulation of the signal, cross-interference between the localization and communication tasks exists. A phase-discrimination algorithm is proposed to separate the waveform coding from the communication coding upon reception and to achieve localization with signal energy increased by up to 3 dB compared with previously reported results.
The joint communication-localization framework also enables a low-complexity receiver design because the same radio is used both for localization and communication.
Simulations comparing the performance of different codes corroborate the theoretical results and illustrate the trade-off between information rate and clutter mitigation, as well as the trade-off between choices of waveform-channel coding pairs. Experimental results from a brass-board microwave system in an indoor environment are also presented and discussed.
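As a toy illustration of the run-length-limited idea referenced above, the sketch below enforces a maximum run of identical bits by bit stuffing, so the backscattered waveform keeps frequent transitions and its energy moves away from the DC/low-Doppler region occupied by self-interference and clutter. This stuffing code is an assumed stand-in, not the specific RLL codes analyzed in the work.

```python
# Toy run-length-limiting via bit stuffing (assumed stand-in for the RLL codes
# studied in the work): never allow more than MAX_RUN identical bits in a row.

MAX_RUN = 3  # assumed maximum run length

def rll_encode(bits, k=MAX_RUN):
    """Copy bits, inserting a complementary bit whenever a run reaches length k."""
    out, prev, run = [], None, 0
    for b in bits:
        run = run + 1 if b == prev else 1
        prev = b
        out.append(b)
        if run == k:
            out.append(1 - b)          # stuffed transition bit
            prev, run = 1 - b, 1
    return out

def rll_decode(coded, k=MAX_RUN):
    """Drop the stuffed bits inserted by rll_encode."""
    out, prev, run, skip_next = [], None, 0, False
    for b in coded:
        if skip_next:                  # this bit was stuffed, not data
            prev, run, skip_next = b, 1, False
            continue
        run = run + 1 if b == prev else 1
        prev = b
        out.append(b)
        if run == k:
            skip_next = True
    return out

if __name__ == "__main__":
    data = [0, 0, 0, 0, 1, 1, 1, 1]
    coded = rll_encode(data)
    assert rll_decode(coded) == data
    print(coded)   # transitions guaranteed at least every MAX_RUN bits
```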
Abstract:
In this paper a new method of establishing secret keys for wireless communications is proposed. A retrodirective array (RDA) that is configured to receive and re-transmit at different frequencies is utilized as a relay node. Specifically, the analogue RDA is able to respond in 'real time', reducing the required number of time slots for key establishment to two, compared with at least three in previous relay key generation schemes. More importantly, in the proposed architecture, equivalent reciprocal wireless channels between legitimate keying nodes can be randomly updated within one channel coherence time, leading to greatly increased key generation rates (KGRs) in slow-fading environments. The secrecy performance of this RDA-assisted key generation system is evaluated, and it is shown to outperform previous relay key generation systems.
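For context, the sketch below shows the basic channel-reciprocity key generation step that the RDA relay is used to accelerate: both legitimate nodes quantize correlated channel observations into key bits. The Gaussian channel model and the 1-bit median quantizer are assumptions for illustration; information reconciliation and privacy amplification are omitted.

```python
# Channel-reciprocity key generation (sketch). Channel model and quantiser are
# assumed; reconciliation and privacy amplification are omitted.

import random
import statistics

def observe_channel(n, seed=0, noise=0.05):
    """Simulate n reciprocal channel-gain samples as seen by Alice and Bob."""
    rng = random.Random(seed)
    common = [rng.gauss(0.0, 1.0) for _ in range(n)]        # shared randomness
    alice = [g + rng.gauss(0.0, noise) for g in common]     # independent noise
    bob = [g + rng.gauss(0.0, noise) for g in common]
    return alice, bob

def quantise(samples):
    """1-bit quantisation around the median of the node's own observations."""
    thr = statistics.median(samples)
    return [1 if s > thr else 0 for s in samples]

if __name__ == "__main__":
    a, b = observe_channel(128)
    key_a, key_b = quantise(a), quantise(b)
    agreement = sum(x == y for x, y in zip(key_a, key_b)) / len(key_a)
    print(f"raw key agreement before reconciliation: {agreement:.2%}")
```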
Abstract:
This special issue has been spurred by new technical trends in wireless communication and the need for high-bit-rate (1 Gbit/s) services in future wireless systems. However, due to the limited transmit power, the wireless transmission distance cannot be too large if the data rate is very high. (...)
Abstract:
Over the past few years, the number of wireless network users has been increasing. Until now, Radio Frequency (RF) has been the dominant technology. However, the electromagnetic spectrum in this region is becoming saturated, demanding alternative wireless technologies. Recently, with the growing market for LED lighting, Visible Light Communication (VLC) has been drawing attention from the research community. First, the LED is an efficient device for illumination. Second, it is easy to modulate and offers high bandwidth. Finally, it can combine illumination and communication in the same device; in other words, it allows highly efficient wireless communication systems to be implemented. One of the most important aspects of a communication system is its reliability when working over noisy channels. In such scenarios, the received data can be affected by errors. To ensure proper system operation, a channel encoder is usually employed. It commonly uses ECC, which appends redundant information to the original data; at the receiver side, the redundant information is used to recover the erroneous data. This dissertation presents the implementation steps of a channel encoder for VLC. Several techniques were considered, such as Reed-Solomon and convolutional codes, block and convolutional interleaving, CRC and puncturing. A detailed analysis of the characteristics of each technique was made in order to choose the most appropriate ones. Simulink models were created to simulate how different codes behave in different scenarios. Later, the models were implemented on an FPGA and simulations were performed. Hardware co-simulations were also implemented to speed up the simulations. In the end, different techniques were combined to create a complete channel encoder capable of detecting and correcting random and burst errors, thanks to the use of an RS(255,213) code with a block interleaver. Furthermore, after the decoding process, the proposed system can identify uncorrectable errors in the decoded data thanks to the CRC-32 algorithm.
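Two pieces of the described encoder chain are easy to sketch with the standard library: a block interleaver, which spreads a burst of channel errors across several RS(255,213) codewords, and the CRC-32 check used to flag uncorrectable frames after decoding. The 5x255 matrix geometry is an assumed example; the Reed-Solomon stage itself is not shown.

```python
# Sketch of two stages of the described chain: block interleaver + CRC-32.
# The 5x255 geometry is an assumed example; the RS(255,213) codec is not shown.

import zlib

def block_interleave(data, rows=5, cols=255):
    """Write row by row, read column by column: a burst of up to `rows` bytes
    lands in `rows` different rows, i.e. different RS codewords."""
    assert len(data) == rows * cols, "pad the frame to fill the matrix"
    return bytes(data[r * cols + c] for c in range(cols) for r in range(rows))

def block_deinterleave(data, rows=5, cols=255):
    return bytes(data[c * rows + r] for r in range(rows) for c in range(cols))

def append_crc32(payload):
    """Attach CRC-32 so residual (uncorrectable) errors can be detected."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def check_crc32(frame):
    payload, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
    return zlib.crc32(payload) == crc

if __name__ == "__main__":
    frame = bytes(i % 256 for i in range(5 * 255))
    assert block_deinterleave(block_interleave(frame)) == frame
    print(check_crc32(append_crc32(b"VLC payload")))   # True
```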
Abstract:
Over the last decade, the success of social networks has significantly reshaped how people consume information. Recommendation of content based on user profiles is well received. However, as users become predominantly mobile, little has been done to consider the impact of the wireless environment, especially capacity constraints and changing channel conditions. In this dissertation, we investigate a centralized wireless content delivery system, aiming to optimize overall user experience given the capacity constraints of the wireless networks by deciding what contents to deliver, when, and how. We propose a scheduling framework that incorporates content-based reward and deliverability. Our approach exploits the broadcast nature of wireless communication and the social nature of content through multicasting and precaching. Results indicate that this joint optimization approach outperforms existing layered systems that separate recommendation and delivery, especially when the wireless network is operating at maximum capacity. By utilizing a limited number of transmission modes, we significantly reduce the complexity of the optimization. We also introduce the design of a hybrid system to handle transmissions for both system-recommended contents ('push') and active user requests ('pull'). Further, we extend the joint optimization framework to wireless infrastructures with multiple base stations. The problem becomes much harder because there are many more system configurations, including but not limited to power allocation and how resources are shared among the base stations ('out-of-band', in which base stations transmit on dedicated spectrum and thus do not interfere; and 'in-band', in which they share the spectrum and must mitigate interference). We propose a scalable two-phase scheduling framework: 1) each base station obtains delivery decisions and resource allocations individually; 2) the system consolidates the decisions and allocations, reducing redundant transmissions. Additionally, if social network applications can provide predictions of how social content disseminates, the wireless network can schedule transmissions accordingly and significantly improve dissemination performance by reducing delivery delay. We propose a novel method utilizing: 1) hybrid systems to handle active dissemination requests; and 2) predictions of dissemination dynamics from the social network applications. This method can mitigate the performance degradation of content dissemination due to wireless delivery delay. Results indicate that our proposed system design is both efficient and easy to implement.
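As a simplified illustration of the core decision, the sketch below greedily picks which contents to multicast in one frame so that content-based reward (here, the number of interested users reached) is maximized under a capacity budget. The greedy reward-per-resource rule is an assumed simplification of the dissertation's joint optimization, shown only to make the trade-off concrete.

```python
# Greedy, capacity-constrained multicast scheduling (sketch). The reward-per-
# resource heuristic is an assumed simplification of the joint optimisation.

def schedule_frame(contents, capacity):
    """contents: list of dicts with 'id', 'size' (resource units) and
    'interested_users'; reward of a multicast = number of users it reaches."""
    ranked = sorted(contents,
                    key=lambda c: len(c["interested_users"]) / c["size"],
                    reverse=True)
    chosen, used = [], 0
    for c in ranked:
        if used + c["size"] <= capacity:
            chosen.append(c["id"])
            used += c["size"]
    return chosen, used

if __name__ == "__main__":
    catalogue = [
        {"id": "clip-A", "size": 4, "interested_users": {"u1", "u2", "u3"}},
        {"id": "clip-B", "size": 2, "interested_users": {"u2", "u4"}},
        {"id": "clip-C", "size": 5, "interested_users": {"u5"}},
    ]
    print(schedule_frame(catalogue, capacity=6))
    # -> (['clip-B', 'clip-A'], 6): one multicast of each serves four users
```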
Abstract:
The proliferation of new mobile communication devices, such as smartphones and tablets, has led to exponential growth in network traffic. The demand to support fast-growing consumer data rates urges wireless service providers and researchers to seek a new, efficient radio access technology, the so-called 5G technology, beyond what current 4G LTE can provide. At the same time, ubiquitous RFID tags, sensors, actuators, mobile phones and the like cut across many areas of modern-day living and offer the ability to measure, infer and understand environmental indicators. The proliferation of these devices gives rise to the Internet of Things (IoT). For researchers and engineers in wireless communication, exploring new, effective techniques to support 5G communication and the IoT is an urgent task that not only leads to fruitful research but also enhances the quality of everyday life. Massive MIMO, which has shown great potential for improving the achievable rate with a very large number of antennas, has become a popular candidate. However, deploying a large number of antennas at the base station may not be feasible in indoor scenarios. Does a good alternative exist that can achieve system performance similar to massive MIMO in indoor environments? In this dissertation, we address this question by proposing the time-reversal (TR) technique as a counterpart of massive MIMO in indoor scenarios with a massive multipath effect. It is well known that radio signals experience many multipaths due to reflection from various scatterers, especially in indoor environments. The traditional TR waveform is able to create a focusing effect at the intended receiver with very low transmitter complexity in a severe multipath channel. TR's focusing effect is in essence a spatial-temporal resonance effect that brings all the multipaths to arrive at a particular location at a specific moment. We show that by using time-reversal signal processing with a sufficiently large bandwidth, one can harvest the massive multipaths naturally existing in a rich-scattering environment to form a large number of virtual antennas and achieve the desired massive multipath effect with a single antenna (a small numerical sketch of this focusing effect follows this abstract). Further, we explore the optimal bandwidth for the TR system to achieve maximal spectral efficiency. By evaluating the spectral efficiency, the optimal bandwidth for the TR system is found to be determined by system parameters, e.g., the number of users and the backoff factor, rather than by the waveform type. Moreover, we investigate the trade-off between complexity and performance by establishing a generalized relationship between system performance and waveform quantization in a practical communication system. It is shown that 4-bit quantized waveforms can achieve a bit-error rate similar to that of a TR system with full-precision waveforms. Besides 5G technology, the Internet of Things (IoT) is another area that has recently attracted increasing attention from both academia and industry. In the second part of this dissertation, the heterogeneity issue within the IoT is explored. One significant form of heterogeneity, given the massive number of devices in the IoT, is device heterogeneity, i.e., heterogeneous bandwidths and associated radio-frequency (RF) components.
Traditional middleware techniques result in fragmentation of the whole network, hampering object interoperability and slowing down the development of a unified reference model for the IoT. We propose a novel TR-based heterogeneous system, which can address the bandwidth heterogeneity while maintaining the benefits of TR. The increase in complexity in the proposed system lies in the digital processing at the access point (AP), instead of at the devices' end, and can easily be handled with a more powerful digital signal processor (DSP). Meanwhile, the complexity of the terminal devices stays low, satisfying the low-complexity and scalability requirements of the IoT. Since there is no middleware in the proposed scheme and the additional physical-layer complexity is concentrated on the AP side, the proposed heterogeneous TR system better satisfies the low-complexity and energy-efficiency requirements of the terminal devices (TDs) compared with the middleware approach.
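The sketch below numerically illustrates the time-reversal focusing effect mentioned above: using the time-reversed, conjugated channel impulse response as the transmit waveform piles the energy of all multipaths onto a single tap. The channel length and statistics are arbitrary assumptions for illustration.

```python
# Numerical sketch of time-reversal focusing. Channel length/statistics assumed.

import random

def random_multipath_channel(length=32, seed=1):
    rng = random.Random(seed)
    return [complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(length)]

def tr_waveform(h):
    """Basic TR waveform: time-reversed complex conjugate of the channel,
    normalised to unit energy."""
    energy = sum(abs(x) ** 2 for x in h) ** 0.5
    return [x.conjugate() / energy for x in reversed(h)]

def convolve(a, b):
    out = [0j] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

if __name__ == "__main__":
    h = random_multipath_channel()
    y = convolve(tr_waveform(h), h)            # noise-free received response
    focus = len(h) - 1                         # all paths add coherently here
    peak = abs(y[focus])
    sidelobe = max(abs(v) for i, v in enumerate(y) if i != focus)
    print(f"peak-to-sidelobe ratio of the focusing effect: {peak / sidelobe:.1f}x")
```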
Abstract:
Cooperative collision warning systems for road vehicles, enabled by recent advances in positioning systems and wireless communication technologies, can potentially reduce traffic accidents significantly. To improve such systems, we propose a graph model to represent interactions between multiple road vehicles in a specific region and at a specific time. Given a list of vehicles in the vicinity, we can generate the interaction graph using several rules that consider vehicle properties such as position, speed, heading, etc. Safety applications can use the model to improve emergency warning accuracy and optimize wireless channel usage. The model also allows us to develop congestion control strategies for an efficient multi-hop broadcast protocol.
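A minimal sketch of building such an interaction graph from simple rules on vehicle kinematics is given below. The 150 m radius and the heading test are assumed example rules, not the paper's exact ones; speed could be folded in, for example via a time-to-collision estimate.

```python
# Interaction-graph construction from simple kinematic rules (sketch). The 150 m
# radius and heading tolerance are assumed example rules.

import math

def should_interact(a, b, radius_m=150.0, heading_tol_deg=90.0):
    """Rule: b is within range of a AND a is heading roughly towards b."""
    dx, dy = b["x"] - a["x"], b["y"] - a["y"]
    if math.hypot(dx, dy) > radius_m:
        return False
    bearing = math.degrees(math.atan2(dy, dx)) % 360
    diff = abs((a["heading"] - bearing + 180) % 360 - 180)
    return diff <= heading_tol_deg

def interaction_graph(vehicles):
    """vehicles: dict id -> {'x', 'y', 'heading' in degrees}. Returns edges;
    speed could be added, e.g. via a time-to-collision test."""
    edges = set()
    ids = list(vehicles)
    for i, u in enumerate(ids):
        for v in ids[i + 1:]:
            if should_interact(vehicles[u], vehicles[v]) or \
               should_interact(vehicles[v], vehicles[u]):
                edges.add((u, v))
    return edges

if __name__ == "__main__":
    fleet = {
        "car1": {"x": 0.0, "y": 0.0, "heading": 0.0},     # heading east
        "car2": {"x": 100.0, "y": 0.0, "heading": 180.0}, # heading west, closing
        "car3": {"x": 500.0, "y": 0.0, "heading": 90.0},  # far away
    }
    print(interaction_graph(fleet))   # -> {('car1', 'car2')}
```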
Abstract:
This paper investigates the impact of carrier frequency offset (CFO) on Single Carrier wireless communication systems with Frequency Domain Equalization (SC-FDE). We show that CFO in SC-FDE systems causes irrecoverable channel estimation error, which leads to inter-symbol interference (ISI). The impact of CFO on SC-FDE and OFDM is compared in the presence of CFO and channel estimation errors. Closed-form expressions for the signal-to-interference-and-noise ratio (SINR) are derived for both systems and verified by simulation results. We find that when channel estimation errors are considered, SC-FDE is similarly sensitive, or even more sensitive, to CFO than OFDM. In particular, in SC-FDE systems CFO degrades system performance mainly by degrading the channel estimation. Both analytical and simulation results highlight the importance of accurate CFO estimation in SC-FDE systems.
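A small numerical sketch of the mechanism described above: an uncompensated CFO rotates the received training block, so the channel estimate feeding the frequency-domain equalizer is biased in both magnitude and phase. The flat channel, all-ones training block and CFO value are assumptions for illustration; the paper's closed-form SINR expressions are not reproduced here.

```python
# Sketch of CFO-induced channel-estimation bias. Flat channel, all-ones training
# block and the CFO value are assumptions for illustration.

import cmath
import math

N = 64        # FDE block length (assumed)
CFO = 0.3     # residual CFO, normalised to the subcarrier spacing (assumed)

def received_training(h=1.0 + 0.0j, cfo=CFO, n=N):
    """Known all-ones training block through a flat channel, rotated by the CFO."""
    return [h * cmath.exp(2j * math.pi * cfo * k / n) for k in range(n)]

def ls_channel_estimate(rx):
    """Least-squares estimate against the all-ones training block (mean of rx)."""
    return sum(rx) / len(rx)

if __name__ == "__main__":
    h_hat = ls_channel_estimate(received_training())
    print(f"true |h| = 1.000, estimated |h| = {abs(h_hat):.3f}, "
          f"phase error = {math.degrees(cmath.phase(h_hat)):.1f} deg "
          f"(caused purely by the uncompensated CFO)")
```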
Abstract:
This paper introduces an energy-efficient Rate Adaptive MAC (RA-MAC) protocol for long-lived Wireless Sensor Networks (WSNs). Previous research shows that the dynamic and lossy nature of wireless communication is one of the major challenges to reliable data delivery in a WSN. RA-MAC achieves high link reliability in such situations by dynamically trading off radio bit rate for signal processing gain. This extra gain reduces the packet loss rate, which results in lower energy expenditure by reducing the number of retransmissions. RA-MAC selects the optimal data rate based on channel conditions with the aim of minimizing energy consumption. We have implemented RA-MAC in TinyOS on an off-the-shelf sensor platform (TinyNode) and evaluated its performance experimentally against a state-of-the-art WSN MAC protocol (SCP-MAC).
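The rate-selection trade-off can be sketched as below: choose the bit rate that minimizes expected energy per successfully delivered packet, balancing airtime against loss-induced retransmissions. The rate set, transmit power and loss figures are placeholder assumptions, not RA-MAC's measured values.

```python
# Energy-aware rate selection (sketch). Rate set, TX power and loss figures are
# placeholder assumptions, not RA-MAC's measured values.

PACKET_BITS = 1024 * 8
TX_POWER_MW = 60.0            # assumed radio transmit power draw

def expected_energy_mj(rate_kbps, loss_prob):
    """Energy per *delivered* packet = energy per attempt / P(success)."""
    airtime_s = PACKET_BITS / (rate_kbps * 1000.0)
    return TX_POWER_MW * airtime_s / max(1.0 - loss_prob, 1e-6)

def select_rate(loss_by_rate):
    """loss_by_rate: dict rate_kbps -> estimated packet loss at that rate."""
    return min(loss_by_rate, key=lambda r: expected_energy_mj(r, loss_by_rate[r]))

if __name__ == "__main__":
    # On a poor link the highest rates lose too many packets, so a mid rate wins.
    poor_link = {25: 0.02, 50: 0.10, 100: 0.60, 200: 0.90}
    print(select_rate(poor_link))   # -> 50
```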
Abstract:
This paper investigates a mobile, wireless sensor/actuator network application for use in the cattle breeding industry. Our goal is to prevent fighting between bulls in on-farm breeding paddocks by autonomously applying appropriate stimuli when one bull approaches another bull. This is an important application because fighting between high-value animals such as bulls during breeding seasons causes significant financial loss to producers. Furthermore, there are significant challenges in this type of application because it requires dynamic animal state estimation, real-time actuation and efficient mobile wireless transmissions. We designed and implemented an animal state estimation algorithm based on a state-machine mechanism for each animal. Autonomous actuation is performed based on the estimated states of an animal relative to other animals. A simple, yet effective, wireless communication model has been proposed and implemented to achieve high delivery rates in mobile environments. We evaluated the performance of our design by both simulations and field experiments, which demonstrated the effectiveness of our autonomous animal control system.
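A minimal sketch of a state-machine style estimator with autonomous actuation, in the spirit of the system described above. The states, distance thresholds and stimulus rule are placeholder assumptions for illustration only.

```python
# State-machine estimator with autonomous actuation (sketch). States, thresholds
# and the stimulus rule are placeholder assumptions.

SAFE, APPROACHING, TOO_CLOSE = "SAFE", "APPROACHING", "TOO_CLOSE"

def next_state(distance_m, closing_speed_mps, warn_dist=30.0, danger_dist=10.0):
    """Estimate one bull's state from its distance and closing speed to the
    nearest other bull."""
    if distance_m <= danger_dist:
        return TOO_CLOSE
    if distance_m <= warn_dist and closing_speed_mps > 0.0:
        return APPROACHING
    return SAFE

def actuate(prev_state, state):
    """Apply a stimulus only on the transition into the dangerous state."""
    return "apply_stimulus" if state == TOO_CLOSE and prev_state != TOO_CLOSE else "none"

if __name__ == "__main__":
    prev = SAFE
    for dist, speed in [(40, 0.5), (25, 1.0), (8, 1.2), (8, 0.0)]:
        cur = next_state(dist, speed)
        print(f"d={dist:>4} m  state={cur:<12} action={actuate(prev, cur)}")
        prev = cur
```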
Abstract:
Harmful Algal Blooms (HABs) have become an important environmental concern along the western coast of the United States. Toxic and noxious blooms adversely impact the economies of coastal communities in the region, pose risks to human health, and cause mortality events that have resulted in the deaths of thousands of fish, marine mammals and seabirds. One goal of field-based research efforts on this topic is the development of predictive models of HABs that would enable rapid response, mitigation and ultimately prevention of these events. In turn, these objectives are predicated on understanding the environmental conditions that stimulate these transient phenomena. An embedded sensor network (Fig. 1), under development in the San Pedro Shelf region off the Southern California coast, is providing tools for acquiring chemical, physical and biological data at high temporal and spatial resolution to help document the emergence and persistence of HAB events, supporting the design and testing of predictive models, and providing contextual information for experimental studies designed to reveal the environmental conditions promoting HABs. The sensor platforms contained within this network include pier-based sensor arrays, ocean moorings and HF radar stations, along with mobile sensor nodes in the form of surface and subsurface autonomous vehicles. Freewave™ radio modems facilitate network communication and form a minimally intrusive, wireless communication infrastructure throughout the Southern California coastal region, allowing rapid and cost-effective data transfer. An emerging focus of this project is the incorporation of a predictive ocean model that assimilates near-real-time, in situ data from deployed Autonomous Underwater Vehicles (AUVs). The model assimilates these data to increase the skill of both nowcasts and forecasts, thus providing insight into bloom initiation as well as the movement of blooms or other oceanic features of interest (e.g., thermoclines, fronts, river discharge, etc.). From these predictions, deployed mobile sensors can be tasked to track a designated feature. This focus has led to the creation of a technology chain in which algorithms are being implemented for innovative trajectory design for AUVs. Such intelligent mission planning is required to maneuver a vehicle to the precise depths and locations that are the sites of active blooms, or to physical/chemical features that might be sources of bloom initiation or persistence. The embedded network yields high-resolution temporal and spatial measurements of pertinent environmental parameters and the resulting biology (see Fig. 1). Supplementing this with ocean current information, remotely sensed imagery and meteorological data, we obtain a comprehensive foundation for developing a fundamental understanding of HAB events. This then directs labor-intensive and costly sampling efforts and analyses. Additionally, we provide coastal municipalities, managers and state agencies with detailed information to aid their efforts in providing responsible environmental stewardship of their coastal waters.