20 results for downlink LTE schedulers
at Aston University Research Archive
Abstract:
We propose a Wiener-Hammerstein (W-H) channel estimation algorithm for Long-Term Evolution (LTE) systems. The LTE standard provides known data as pilot symbols and exploits them through coherent detection to improve system performance. These pilots are placed in a hybrid pattern to cover both the time and frequency domains. Our aim is to adapt the W-H equalizer (W-H/E) to the LTE standard to compensate for both linear and nonlinear effects induced by power amplifiers and multipath channels. We evaluate the performance of the W-H/E for a downlink LTE system in terms of BLER, EVM and throughput versus SNR. We then compare the results with a traditional least-mean-square (LMS) equalizer. It is shown that the W-H/E can significantly reduce both linear and nonlinear distortions compared to LMS and improve LTE downlink system performance.
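The abstract above benchmarks the W-H equalizer against a least-mean-square (LMS) baseline. As a rough, hypothetical sketch (not the paper's implementation), a pilot-trained linear LMS equalizer fits in a few lines; the tap count and step size below are illustrative:

```python
def lms_equalize(received, pilots, num_taps=3, mu=0.05):
    """Train a linear feed-forward equalizer with the LMS rule on
    known pilot symbols. Tap count and step size are illustrative."""
    w = [0j] * num_taps
    for n, d in enumerate(pilots):
        # Regressor: the most recent num_taps received samples.
        x = [received[n - k] if n - k >= 0 else 0j for k in range(num_taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))  # equalizer output
        e = d - y                                 # error vs known pilot
        w = [wi + mu * e * xi.conjugate() for wi, xi in zip(w, x)]
    return w
```

With noiseless training data the taps converge to the exact channel inverse; in practice the step size trades convergence speed against steady-state error.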
Abstract:
It is desirable that energy performance improvement is not achieved at the expense of other network performance parameters. This paper investigates the trade-off between energy efficiency, spectral efficiency and user QoS performance for a multi-cell, multi-user radio access network. Specifically, the energy consumption ratio (ECR) and the spectral efficiency of several common frequency-domain packet schedulers in a cellular E-UTRAN downlink are compared for both the SISO transmission mode and the 2x2 Alamouti Space Frequency Block Code (SFBC) MIMO transmission mode. It is well known that the 2x2 SFBC MIMO transmission mode is more spectrally efficient than the SISO transmission mode; however, the relationship between energy efficiency and spectral efficiency has not been established. It is shown that, for the E-UTRAN downlink with fixed transmission power, spectral efficiency improvement results in energy efficiency improvement. The effect of SFBC MIMO versus SISO on user QoS performance is also studied. © 2011 IEEE.
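The ECR and spectral-efficiency comparison above reduces to two simple ratios. The helpers below use one common set of definitions (J/bit and bit/s/Hz); the paper's exact normalisation may differ:

```python
def ecr(total_energy_j, delivered_bits):
    """Energy consumption ratio: joules per successfully delivered bit."""
    return total_energy_j / delivered_bits

def spectral_efficiency(delivered_bits, duration_s, bandwidth_hz):
    """Delivered bits per second per hertz of occupied bandwidth."""
    return delivered_bits / (duration_s * bandwidth_hz)
```

With fixed transmission power, energy is power times time, so ECR is inversely proportional to throughput; this is why a spectral-efficiency gain translates directly into an energy-efficiency gain in that setting.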
Abstract:
In this paper, we experimentally demonstrate the seamless integration of full-duplex frequency division duplex (FDD) long-term evolution (LTE) technology with radio over fiber (RoF) for eNodeB (eNB) coverage extension. The LTE signal is composed of quadrature phase-shift keying (QPSK), 16-quadrature amplitude modulation (16-QAM) and 64-QAM, modulated onto orthogonal frequency division multiplexing (OFDM) and single-carrier frequency division multiplexing for downlink (DL) and uplink (UL) transmissions, respectively. The RoF system comprises dedicated directly modulated lasers for DL and UL with dense wavelength division multiplexing (DWDM) for instantaneous connections and for mitigation of Rayleigh backscattering and nonlinear interference. DL and UL signals have varying carrier frequencies and are categorized as broad frequency spacing (BFS), intermediate frequency spacing (IFS), and narrow frequency spacing (NFS). The adjacent channel leakage ratio (ACLR) for DL and UL with 64-QAM is similar for all frequency spacings, while crosstalk is observed for NFS. For the best-case scenario for DL and UL transmissions we achieve error vector magnitude (EVM) values of ~2.30%, ~2.33%, and ~2.39% for QPSK, 16-QAM, and 64-QAM, respectively, while for the worst-case scenario with NFS the EVM increases by 0.40% for all schemes. © 2009-2012 OSA.
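EVM, the headline metric in the abstract above, is conventionally the RMS error vector normalised to the RMS reference-constellation magnitude. A minimal sketch (normalisation conventions vary between standards):

```python
def evm_percent(received, reference):
    """RMS error vector magnitude as a percentage of the RMS
    reference-constellation magnitude."""
    err = sum(abs(r - s) ** 2 for r, s in zip(received, reference))
    ref = sum(abs(s) ** 2 for s in reference)
    return 100.0 * (err / ref) ** 0.5
```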
Abstract:
In this paper, the implementation aspects and constraints of the simplest network coding (NC) schemes for a two-way relay channel (TWRC), composed of a user equipment (mobile terminal), an LTE relay station (RS) and an LTE base station (eNB), are considered in order to assess the usefulness of NC in more realistic scenarios. The information exchange rate gain (IERG), the energy reduction gain (ERG) and the resource utilization gain (RUG) of the NC schemes with and without subcarrier division duplexing (SDD) are obtained by computer simulations. The usefulness of the NC schemes is evaluated for varying traffic load levels, geographical distances between the nodes, RS transmit powers, and maximum numbers of retransmissions. Simulation results show that the NC schemes with and without SDD achieve throughput gains of 0.5% and 25%, ERGs of 7-12% and 16-25%, and RUGs of 0.5-3.2%, respectively. It is found that NC can also provide performance gains for users at the cell edge. Furthermore, the ERGs of NC increase with the transmit power of the relay, while the ERGs of NC remain the same even when the maximum number of retransmissions is reduced.
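The simplest network coding scheme for a two-way relay is usually the XOR scheme: the relay broadcasts the XOR of the two packets once instead of forwarding each separately, and each end node cancels its own contribution. A toy illustration (function names are ours, not the paper's):

```python
def xor_bytes(a, b):
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def relay_exchange(pkt_a, pkt_b):
    """Two-way relay with XOR network coding: the relay broadcasts
    pkt_a XOR pkt_b once; each end removes its own packet to recover
    the other's. Three transmissions instead of four."""
    coded = xor_bytes(pkt_a, pkt_b)           # single relay broadcast
    recovered_at_a = xor_bytes(coded, pkt_a)  # A already knows pkt_a
    recovered_at_b = xor_bytes(coded, pkt_b)  # B already knows pkt_b
    return recovered_at_a, recovered_at_b
```

Saving one of four transmissions is the source of the throughput and energy gains that the paper quantifies under realistic LTE constraints.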
Abstract:
This research focuses on the optimisation of resource utilisation in wireless mobile networks, taking into account the users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G-LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated. These include video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems. A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for evaluating the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment have been used in the validation tests. It has been shown that Pause Intensity is closely correlated with subjective quality measurement in terms of the Mean Opinion Score, and that this correlation property is content independent. Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for the given user requirements, communication system specifications and network performance. This approach concerns both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with the consideration of the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness.
The 3GPP Long Term Evolution (LTE) system is used as the main application environment, where the proposed research framework is examined and the results are compared with existing scheduling methods on the achievable fairness, efficiency and correlation. Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritization of users by considering their perceived quality for the services received. Meanwhile, a trade-off between fairness and efficiency is maintained through an online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of the adaptive streaming service. The adaptive rates under various channel conditions, and the shape of the QoE distribution amongst the users for different scheduling policies, have been demonstrated in the context of LTE. Finally, the work on interworking between the mobile communication system at the macro-cell level and the different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading mechanism of the user's data (e.g. video traffic), while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of the resource allocation across the fair-efficient spectrum. The associated offloading mechanism can properly control the number of users within the coverage of the macro-cell base station and each of the WiFi access points involved. The performance of non-seamless and user-controlled mobile traffic offloading (through mobile WiFi devices) has been evaluated and compared with that of the standard operator-controlled WiFi hotspots.
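Pause Intensity, the metric this thesis is built on, combines how long and how often playback stalls. The sketch below uses one plausible formulation (pause-time ratio times pause frequency); the thesis' exact definition may differ:

```python
def pause_intensity(pause_durations_s, playback_time_s):
    """One plausible pause-continuity metric: the fraction of time
    spent paused multiplied by the pause frequency (pauses/second).
    The thesis' exact Pause Intensity definition may differ."""
    ratio = sum(pause_durations_s) / playback_time_s
    frequency = len(pause_durations_s) / playback_time_s
    return ratio * frequency
```

Under any such formulation, uninterrupted playback scores zero, and either longer or more frequent pauses push the metric (and thus the perceived impairment) up.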
Abstract:
In this paper a new approach to resource allocation and scheduling that reflects the effect of the user's Quality of Experience is presented. The proposed scheduling algorithm is examined in the context of the 3GPP Long Term Evolution (LTE) system. Pause Intensity (PI), an objective and no-reference quality assessment metric, is employed to represent user satisfaction in the scheduler of the eNodeB. PI is in effect a measurement of discontinuity in the service. The performance of the proposed scheduling method is compared with two extreme cases: the maxCI and Round Robin scheduling schemes, which correspond to efficiency- and fairness-oriented mechanisms, respectively. Our work reveals that the proposed method is able to operate between the fairness and efficiency requirements, raising user satisfaction to the desired level. © VDE VERLAG GMBH.
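The two extreme schedulers named above are easy to state precisely: maxCI always serves the user with the best instantaneous channel, while Round Robin cycles through users regardless of channel state. A toy comparison, with Jain's index as the fairness measure (our choice of illustration, not the paper's code):

```python
def max_ci(rates, t):
    """maxCI: serve the user with the best instantaneous rate."""
    return max(range(len(rates)), key=lambda u: rates[u])

def round_robin(rates, t):
    """Round Robin: cycle through users, ignoring channel state."""
    return t % len(rates)

def simulate(policy, rate_trace):
    """Give each slot to one user; return per-user throughput."""
    got = [0.0] * len(rate_trace[0])
    for t, rates in enumerate(rate_trace):
        u = policy(rates, t)
        got[u] += rates[u]
    return got

def jain_fairness(throughputs):
    """Jain's index: 1.0 is perfectly fair, 1/n is maximally unfair."""
    s, sq, n = sum(throughputs), sum(x * x for x in throughputs), len(throughputs)
    return s * s / (n * sq) if sq else 1.0
```

A PI-driven scheduler sits between these extremes, trading total throughput for user satisfaction.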
Abstract:
Owing to the limited cell size of the eNodeB (eNB), the relay node has emerged as an attractive solution for the long-term evolution (LTE) system. The nonlinear limit of the alternative method to multiple-input multiple-output (MIMO), based on frequency division multiplexing (FDM) of orthogonal FDM (OFDM) signals, is analysed over varying transmission spans. In this work, it is shown that the degradation pattern over the linear, intermixing and nonlinear propagation regions is consistent for the 2 and 2.6 GHz bands. Both bands experienced a linear increase in error vector magnitude (EVM) in the linear and nonlinear regions, proportional to the increasing transmission span. In addition, an optical launch power between -2 and 2 dBm achieved an EVM significantly below the LTE limit of 8% for the 10-60 km spans. © The Institution of Engineering and Technology 2014.
Abstract:
Mobile WiFi devices are becoming increasingly popular in non-seamless and user-controlled mobile traffic offloading alongside the standard WiFi hotspots. Unlike the operator-controlled hotspots, a mobile WiFi device relies on the capacity of the macro-cell for the data rate allocated to it. This type of device can help offload data traffic from the macro-cell base station and serve end users within a closer range, but will change the pattern of resource distribution operated by the base station. We propose a resource allocation scheme that aims to optimize user quality of experience (QoE) when accessing video services in an environment where traffic offloading takes place through interworking between a mobile communication system and short-range wireless LANs. In this scheme, a rate redistribution algorithm is derived to perform scheduling, controlled by a no-reference quality assessment metric, in order to achieve the desired trade-offs between efficiency and fairness. We show the performance of this algorithm in terms of the distribution of the allocated data rates throughout the macro-cell investigated and the service coverage offered by the WiFi access point.
Abstract:
Mobile communication and networking infrastructures play an important role in the development of smart cities, supporting the real-time information exchange and management required in modern urbanization. Mobile WiFi devices that help offload data traffic from the macro-cell base station and serve end users within a closer range can significantly improve the connectivity of wireless communications between essential components of a city, including infrastructural and human devices. However, this offloading function, realised through interworking between LTE and WiFi systems, will change the pattern of resource distribution operated by the base station. In this paper, a resource allocation scheme is proposed to ensure stable service coverage and end-user quality of experience (QoE) when offloading takes place in a macro-cell environment. In this scheme, a rate redistribution algorithm is derived to form a parametric scheduler that meets the required levels of efficiency and fairness, guided by a no-reference quality assessment metric. We show that the performance of resource allocation can be regulated by this scheduler without affecting the service coverage offered by the WLAN access point. The performance of different interworking scenarios and macro-cell scheduling policies is also compared.
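A parametric scheduler that trades efficiency against fairness can be illustrated with a standard alpha-fair weighting, used here only as a stand-in for the paper's rate redistribution algorithm:

```python
def redistribute(rates, capacity, alpha):
    """Split capacity with weights rate**(1 - alpha): alpha = 0 leans
    towards efficiency (good channels get more), alpha = 1 gives an
    equal split, alpha > 1 compensates users with poor channels.
    Illustrative stand-in, not the paper's algorithm."""
    weights = [r ** (1.0 - alpha) for r in rates]
    total = sum(weights)
    return [capacity * w / total for w in weights]
```

A single parameter then moves the allocation continuously between the efficiency-oriented and fairness-oriented ends of the spectrum, which is the kind of online adjustment the scheme relies on.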
Abstract:
In this paper, a review of radio-over-fiber (RoF) technology is conducted to support the exploding growth of mobile broadband. An RoF system provides a platform for a distributed antenna system (DAS) as a fronthaul for long term evolution (LTE) technology. A higher splitting ratio from a macrocell is required to support a large DAS topology, hence higher optical launch power (OLP) is the right approach. However, high OLP generates undesired nonlinearities, namely stimulated Brillouin scattering (SBS). Three different approaches to mitigating the SBS process are covered in this paper, and the solutions ultimately provide an additional 4 dB of link budget.
Abstract:
In the ventrobasal (VB) thalamus, astrocytes are known to elicit NMDA-receptor-mediated slow inward currents (SICs) spontaneously in neurons. Fluorescence imaging of astrocytes and patch clamp recordings from thalamocortical (TC) neurons in the VB of 6-23-day-old Wistar rats were performed. TC neurons exhibit spontaneous SICs at low frequencies (~0.0015 Hz) that were inhibited by the NMDA-receptor antagonist D-AP5 (50 µM) and were insensitive to TTX (1 µM), suggesting a non-neuronal origin. The effect of corticothalamic (CT) and sensory (Sen) afferent stimulation on astrocyte signalling was assessed by varying stimulus parameters. Moderate synaptic stimulation elicited astrocytic Ca2+ increases, but did not affect the incidence of spontaneous SICs. Prolonged synaptic stimulation induced a 265% increase in SIC frequency. This increase lasted over one hour after the cessation of synaptic stimulation, revealing a Long Term Enhancement (LTE) of astrocyte-neuron signalling. LTE induction required group I mGluR activation. LTE SICs targeted NMDA receptors located at extrasynaptic sites. LTE showed a developmental profile: over weeks 1-3, the SIC frequency increased by an average of 50%, 240% and 750%, respectively. Prolonged exposure to glutamate (200 µM) increased spontaneous SIC frequency by 1800%. This “chemical” form of LTE was prevented by the broad-spectrum excitatory amino acid transporter (EAAT) inhibitor TBOA (300 µM), suggesting that glutamate uptake is a critical factor. My results therefore show complex glutamatergic signalling interactions between astrocytes and neurons. Furthermore, two previously unrecognised mechanisms of enhancing SIC frequency are described. The synaptically induced LTE represents a form of non-synaptic plasticity and a glial “memory” of previous synaptic activity, whilst the enhancement after prolonged glutamate exposure may represent a pathological glial signalling mechanism.
Abstract:
In this paper a Markov chain based analytical model is proposed to evaluate the slotted CSMA/CA algorithm specified in the MAC layer of the IEEE 802.15.4 standard. The analytical model consists of two two-dimensional Markov chains, used to model the state transitions of an 802.15.4 device during a frame transmission and between two consecutive frame transmissions, respectively. By introducing the two Markov chains, only a small number of Markov states is required and the scalability of the analytical model is improved. The analytical model is used to investigate the impact of the CSMA/CA parameters, the number of contending devices, and the data frame size on the network performance in terms of throughput and energy efficiency. It is shown by simulations that the proposed analytical model can accurately predict the performance of the slotted CSMA/CA algorithm for uplink, downlink and bi-directional traffic, in both acknowledgement and non-acknowledgement modes.
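For intuition about the contention such a model captures, the toy simulation below draws one random backoff per device and counts a slot as successful only when the minimum backoff is unique. It is a drastic simplification of 802.15.4 slotted CSMA/CA (no CCA stages, retransmissions, or ACKs), not the paper's model:

```python
import random

def csma_success_rate(num_devices, be=3, rounds=10000, seed=1):
    """Toy slotted-contention model: every device draws a backoff in
    [0, 2**be - 1]; the slot succeeds only if the minimum backoff is
    held by exactly one device (ties collide)."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(rounds):
        backoffs = [rng.randrange(2 ** be) for _ in range(num_devices)]
        if backoffs.count(min(backoffs)) == 1:
            successes += 1
    return successes / rounds
```

Even this crude model reproduces the qualitative trend the analysis studies: success probability falls as the number of contending devices grows.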
Abstract:
HSDPA (High-Speed Downlink Packet Access) is a 3.5-generation asynchronous mobile communications service based on third-generation W-CDMA. In Korea, it is mainly provided through videophone services. Because of the diffusion of more powerful and diversified services, along with steep advances in mobile communications technology, consumers demand a wide range of choices. However, the variety of technologies, which tend to flood the market regardless of consumer preferences, leaves consumers increasingly confused. Therefore, we should not adopt strategies that focus only on developing new technology on the assumption that new technologies are next-generation projects. Instead, we should understand the process by which consumers accept new forms of technology and devise schemes to lower market entry barriers through strategies that enable developers to understand and provide what consumers really want.
Abstract:
The objective of this paper is to combine antenna downtilt selection with cell size selection in order to reduce the overall radio frequency (RF) transmission power in a homogeneous High-Speed Downlink Packet Access (HSDPA) cellular radio access network (RAN). The analysis is based on the concept of small-cell deployment. The energy consumption ratio (ECR) and the energy reduction gain (ERG) of the cellular RAN are calculated for different antenna tilts as the cell size is reduced for a given user density and service area. The results show that a suitable antenna tilt and RF power setting can achieve an overall energy reduction of up to 82.56%. Equally, our results demonstrate that a small-cell deployment can considerably reduce the overall energy consumption of a cellular network.
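The ERG figure quoted above is a simple relative saving. Using the usual definition:

```python
def energy_reduction_gain(reference_energy, new_energy):
    """ERG: percentage energy saving relative to a reference setup."""
    return 100.0 * (reference_energy - new_energy) / reference_energy
```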
Abstract:
Wireless Mesh Networks (WMNs) have emerged as a key technology for the next generation of wireless networking. Rather than being just another type of ad-hoc network, WMNs diversify the capabilities of ad-hoc networks. Protocols that work over WMNs include IEEE 802.11a/b/g, 802.15, 802.16 and LTE-Advanced. To sustain high throughput under varying conditions, these protocols have to adapt their transmission rate. In this paper, we propose a scheme to improve performance under varying channel conditions by performing rate adaptation along with multiple packet transmission, using packet loss and physical-layer conditions. Dynamic monitoring, multiple packet transmission and adaptation to changes in channel quality by adjusting packet transmission rates according to certain optimization criteria provide greater throughput. The key feature of the proposed method is the combination of two factors: 1) detection of intrinsic channel conditions by measuring the fluctuation of the noise-to-signal ratio via its standard deviation, and 2) detection of packet loss induced by congestion. We show that the use of such techniques in a WMN can significantly improve performance in terms of the packet sending rate. The effectiveness of the proposed method was demonstrated in a simulated wireless network testbed via packet-level simulation.
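The two-factor rule described above (channel variability plus congestion loss) can be sketched as a simple step-up/step-down controller; the thresholds and rate set below are illustrative, not the paper's parameters:

```python
def adapt_rate(current, snr_samples, loss_ratio,
               rates=(6, 12, 24, 48, 54), std_limit=3.0, loss_limit=0.1):
    """Step the rate down when congestion loss is high or the channel
    is unstable (large SNR standard deviation); otherwise probe upward.
    Thresholds and the rate set (Mbit/s) are illustrative."""
    mean = sum(snr_samples) / len(snr_samples)
    std = (sum((s - mean) ** 2 for s in snr_samples) / len(snr_samples)) ** 0.5
    i = rates.index(current)
    if loss_ratio > loss_limit or std > std_limit:
        return rates[max(i - 1, 0)]           # back off one step
    return rates[min(i + 1, len(rates) - 1)]  # channel healthy: step up
```

Separating the variability signal from the loss signal is what lets such a controller distinguish a genuinely degraded channel from congestion-induced loss.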