906 results for "Link quality estimation"


Relevance: 90.00%

Abstract:

Urbanisation significantly changes the characteristics of a catchment as natural areas are transformed into impervious surfaces such as roads, roofs and parking lots. The increased fraction of impervious surfaces changes the stormwater runoff characteristics, whilst a variety of anthropogenic activities common to urban areas generate a range of pollutants such as nutrients, solids and organic matter. These pollutants accumulate on catchment surfaces and are removed and transported by stormwater runoff, thereby contributing pollutant loads to receiving waters. In summary, urbanisation influences the stormwater characteristics of a catchment, including hydrology and water quality. Due to the growing recognition that stormwater pollution is a significant environmental problem, the implementation of mitigation strategies to improve the quality of stormwater runoff is becoming increasingly common in urban areas. A scientifically robust stormwater quality treatment strategy is an essential requirement for effective urban stormwater management. The efficient design of treatment systems depends closely on the state of knowledge of the primary factors influencing stormwater quality. In this regard, stormwater modelling outcomes provide designers with important guidance and datasets which underpin the design of effective stormwater treatment systems. Therefore, the accuracy of modelling approaches and the reliability of modelling outcomes are of particular concern. This book discusses the inherent complexity and key characteristics of urban hydrology and stormwater quality, based on the influence exerted by a range of rainfall and catchment characteristics. A comprehensive field sampling and testing programme on pollutant build-up, an urban catchment monitoring programme on stormwater quality, and the outcomes of advanced statistical analyses provided the platform for the knowledge creation. Two case studies and two real-world applications are discussed to illustrate the translation of the knowledge created into practice regarding the role of rainfall and catchment characteristics in urban stormwater quality. An innovative rainfall classification based on stormwater quality was developed to support the effective and scientifically robust design of stormwater treatment systems. Underpinned by the rainfall classification methodology, a reliable approach for design rainfall selection is proposed in order to optimise stormwater treatment based on both stormwater quality and quantity. This is a paradigm shift from the common approach in which stormwater treatment systems are designed solely on the basis of stormwater quantity data. Additionally, how pollutant build-up and stormwater runoff quality vary with a range of catchment characteristics was also investigated. Based on the study outcomes, it can be concluded that the use of only a limited number of catchment parameters, such as land use and impervious surface percentage, as is the case in current modelling approaches, could result in appreciable error in water quality estimation. Influential catchment characteristics to be incorporated into modelling should also include urban form and the distribution of impervious surface area.
The knowledge created through the research investigations discussed in this monograph is expected to make a significant contribution to engineering practice, including hydrologic and stormwater quality modelling, stormwater treatment design and urban planning, as the study outcomes provide practical approaches and recommendations for urban stormwater quality enhancement. Furthermore, this monograph also demonstrates how fundamental knowledge of stormwater quality processes can be translated into guidance on engineering practice, illustrates the comprehensive application of multivariate data analysis techniques, and presents a paradigm for the integrative use of computer models and mathematical models to derive practical outcomes.

Relevance: 90.00%

Abstract:

In this paper, we present a machine learning approach to measure the visual quality of JPEG-coded images. The features for predicting the perceived image quality are extracted by considering key human visual sensitivity (HVS) factors such as edge amplitude, edge length, background activity and background luminance. Image quality assessment involves estimating the functional relationship between HVS features and subjective test scores. The quality of the compressed images is estimated without referring to their original images (a 'no-reference' metric). Here, the problem of quality estimation is transformed into a classification problem and solved using the extreme learning machine (ELM) algorithm. In ELM, the input weights and bias values are chosen randomly and the output weights are calculated analytically. The generalization performance of the ELM algorithm for classification problems with an imbalance in the number of samples per quality class depends critically on the input weights and bias values. Hence, we propose two schemes, namely the k-fold selection scheme (KS-ELM) and the real-coded genetic algorithm (RCGA-ELM), to select the input weights and bias values such that the generalization performance of the classifier is maximized. Results indicate that the proposed schemes significantly improve the performance of the ELM classifier under imbalanced conditions for image quality assessment. The experimental results show that the visual quality estimated by the proposed RCGA-ELM emulates the mean opinion score very well. The results are also compared with an existing JPEG no-reference image quality metric and the full-reference structural similarity image quality metric.
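
To make the ELM mechanics described above concrete (random input weights and biases, output weights computed analytically in a single least-squares step), here is a minimal self-contained Python sketch. The feature dimensions, class count and data are hypothetical, and the paper's KS-ELM and RCGA-ELM weight-selection schemes are not reproduced.

```python
import numpy as np

def elm_train(X, T, n_hidden, rng=None):
    """Basic ELM: random input weights/biases, analytic output weights."""
    rng = np.random.default_rng(rng)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                   # sigmoid hidden layer
    beta = np.linalg.pinv(H) @ T                             # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)                       # predicted quality class

# Toy usage: 200 hypothetical HVS feature vectors, 5 quality classes.
X = np.random.rand(200, 4)
y = np.random.randint(0, 5, 200)
T = np.eye(5)[y]                                             # one-hot targets
W, b, beta = elm_train(X, T, n_hidden=40, rng=0)
print("training accuracy:", (elm_predict(X, W, b, beta) == y).mean())
```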

Relevance: 90.00%

Abstract:

In this paper, we present a growing and pruning radial basis function (GAP-RBF) based no-reference (NR) image quality model for JPEG-coded images. The quality of the images is estimated without referring to their original images. The features for predicting the perceived image quality are extracted by considering key human visual sensitivity (HVS) factors such as edge amplitude, edge length, background activity and background luminance. Image quality estimation involves computing the functional relationship between HVS features and subjective test scores. Here, the problem of quality estimation is transformed into a function approximation problem and solved using a GAP-RBF network. The GAP-RBF network uses a sequential learning algorithm to approximate the functional relationship. Its computational complexity and memory requirements are lower than those of batch learning algorithms. Moreover, the GAP-RBF algorithm finds a compact image quality model and does not require retraining when new image samples are presented. Experimental results show that the GAP-RBF image quality model emulates the mean opinion score (MOS) well. The subjective test results of the proposed metric are compared with a JPEG no-reference image quality index as well as a full-reference structural similarity image quality index, and the proposed metric is observed to outperform both.
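
The following Python sketch illustrates the flavour of sequential RBF learning: grow a hidden unit when the prediction error is large and the input is far from existing centres, otherwise adapt the weights with a gradient step. It is a simplified illustration only; the actual GAP-RBF growing and pruning criteria are based on a neuron-significance measure not reproduced here, and all thresholds below are hypothetical.

```python
import numpy as np

def rbf_out(x, centers, widths, weights):
    """Output of an RBF network at input x, plus the hidden activations."""
    phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2 * widths ** 2))
    return phi @ weights, phi

def sequential_rbf_fit(X, y, e_grow=0.2, d_grow=0.3, lr=0.05, kappa=0.8):
    """Sequential learning sketch: add a neuron when the error is large
    and the input is far from existing centres; else do an LMS update."""
    centers, widths, weights = X[:1].copy(), np.array([d_grow]), np.array([y[0]])
    for x, t in zip(X[1:], y[1:]):
        f, phi = rbf_out(x, centers, widths, weights)
        err = t - f
        dist = np.min(np.linalg.norm(centers - x, axis=1))
        if abs(err) > e_grow and dist > d_grow:          # growth criterion
            centers = np.vstack([centers, x])
            widths = np.append(widths, kappa * dist)
            weights = np.append(weights, err)
        else:                                            # LMS weight update
            weights = weights + lr * err * phi
    return centers, widths, weights

# Toy usage: approximate a 1-D mapping from a feature to an MOS-like score.
X = np.linspace(0, 1, 200).reshape(-1, 1)
y = np.sin(2 * np.pi * X[:, 0])
c, s, w = sequential_rbf_fit(X, y)
print(len(c), "hidden units after one sequential pass")
```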

Relevance: 90.00%

Abstract:

An important question in kernel regression is how to estimate the order and bandwidth parameters from available noisy data. We propose to solve this problem within a risk estimation framework. Considering an independent and identically distributed (i.i.d.) Gaussian observations model, we use Stein's unbiased risk estimator (SURE) to estimate a weighted mean-square error (MSE) risk, and optimize it with respect to the order and bandwidth parameters. The two parameters are thus spatially adapted in such a manner that noise smoothing and fine-structure preservation are achieved simultaneously. On the application side, we consider the problem of image restoration from uniform/non-uniform data, and show that the SURE approach to spatially adaptive kernel regression yields better estimation quality than its spatially non-adaptive counterparts. The denoising results obtained are comparable to those obtained using other state-of-the-art techniques, and in some scenarios superior.
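
For any estimator that is linear in the data, f_hat = S(h) y, with i.i.d. Gaussian noise of known variance sigma^2, SURE takes the closed form ||y - S(h)y||^2 - N sigma^2 + 2 sigma^2 tr S(h), since the divergence term reduces to the trace of the smoother matrix. The Python sketch below uses this to pick a single global bandwidth for an order-0 (Nadaraya-Watson) estimate on a toy 1-D signal; the paper's method goes further by adapting both order and bandwidth per spatial location and by weighting the MSE.

```python
import numpy as np

def nw_smoother_matrix(x, h):
    """Smoother matrix S(h) of a Nadaraya-Watson (order-0 kernel
    regression) estimate with a Gaussian kernel: f_hat = S(h) @ y."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return K / K.sum(axis=1, keepdims=True)

def sure(y, S, sigma):
    """Stein's unbiased estimate of the MSE risk for a linear estimator."""
    f = S @ y
    return np.sum((y - f) ** 2) - y.size * sigma**2 + 2 * sigma**2 * np.trace(S)

# Toy usage: choose the bandwidth minimizing SURE on a noisy sinusoid.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 256)
sigma = 0.1
y = np.sin(4 * np.pi * x) + sigma * rng.standard_normal(x.size)
grid = np.linspace(0.005, 0.1, 40)
scores = [sure(y, nw_smoother_matrix(x, h), sigma) for h in grid]
print("SURE-optimal bandwidth:", grid[int(np.argmin(scores))])
```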

Relevance: 90.00%

Abstract:

The proliferation of multimedia content and the demand for new audio and video services have fostered the development of a new era based on multimedia information, which has enabled the evolution of Wireless Multimedia Sensor Networks (WMSNs) and Flying Ad-Hoc Networks (FANETs). Live multimedia services require real-time video transmission with a low frame loss rate, tolerable end-to-end delay and jitter to support video dissemination with Quality of Experience (QoE) support. Hence, a key principle in a QoE-aware approach is to protect high-priority frames by transmitting them with a minimum packet loss ratio and minimal network overhead. Moreover, multimedia content must be transmitted from a given source to the destination via intermediate nodes with high reliability in large-scale scenarios. The routing service must cope with dynamic topologies caused by node failure or mobility, as well as wireless channel changes, in order to keep operating during multimedia transmission. Finally, understanding user satisfaction when watching a video sequence is becoming a key requirement for the delivery of multimedia content with QoE support. With this goal in mind, solutions for multimedia transmission must take the video characteristics into account to improve video quality delivery. The main research contributions of this thesis are driven by the research question of how to provide multimedia distribution with high energy-efficiency, reliability, robustness, scalability and QoE support over wireless ad hoc networks. The thesis addresses several problem domains with contributions on different layers of the communication stack. At the application layer, we introduce a QoE-aware packet redundancy mechanism to reduce the impact of the unreliable and lossy nature of the wireless environment on the dissemination of live multimedia content. At the network layer, we introduce two routing protocols, namely the video-aware Multi-hop and multi-path hierarchical routing protocol for Efficient VIdeo transmission in static WMSN scenarios (MEVI), and the cross-layer link-quality- and geographical-aware beaconless opportunistic routing (OR) protocol for multimedia FANET scenarios (XLinGO). Both protocols enable multimedia dissemination with energy-efficiency, reliability and QoE support. This is achieved by combining multiple cross-layer metrics in the routing decision in order to establish reliable routes.
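
As a purely illustrative sketch of the cross-layer idea (the concrete metrics, normalisation and weights used by MEVI and XLinGO are not reproduced here), a relay-selection score can combine several normalised metrics from different layers; all names and weights below are hypothetical.

```python
def route_score(link_quality, geo_progress, energy, queue_free,
                w=(0.4, 0.3, 0.2, 0.1)):
    """Combine normalised (0..1) cross-layer metrics into a single score:
    PHY link quality, geographic progress towards the destination,
    residual node energy, and free buffer space at the MAC queue."""
    metrics = (link_quality, geo_progress, energy, queue_free)
    return sum(wi * mi for wi, mi in zip(w, metrics))

# A relay candidate with good link quality and progress scores highest.
candidates = {
    "relay_a": route_score(0.9, 0.8, 0.6, 0.7),
    "relay_b": route_score(0.5, 0.9, 0.9, 0.9),
}
print("selected relay:", max(candidates, key=candidates.get))
```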


Relevance: 90.00%

Abstract:

The widespread use of wireless-enabled devices and the increasing capabilities of wireless technologies have promoted multimedia content access and sharing among users. However, the quality perceived by users still depends on multiple factors, such as video characteristics, device capabilities and link quality. While video characteristics include temporal and spatial complexity as well as coding complexity, one of the most important device characteristics is battery lifetime. There is a need to assess how these aspects interact and how they impact overall user satisfaction. This paper advances previous work by proposing and validating a flexible framework, named EViTEQ, to be applied in real testbeds to satisfy the requirements of performance assessment. EViTEQ is able to measure network interface energy consumption with high precision while being completely technology-independent and assessing application-level quality of experience. The results obtained in the testbed show the relevance of combined multi-criteria measurement approaches, leading to superior evaluation of end-user satisfaction perception.

Relevance: 90.00%

Abstract:

Link adaptation is a critical component of IEEE 802.11 systems, adapting the transmission rate to dynamic wireless channel conditions. In this paper, we investigate a general cross-layer link adaptation algorithm that jointly considers the physical-layer link quality and random channel access at the MAC layer. An analytic model is proposed for the link adaptation algorithm. The underlying wireless channel is modeled with a multi-state discrete-time Markov chain. Compared with a purely link-quality-based link adaptation algorithm, the proposed cross-layer algorithm can achieve considerable performance gains of up to 20%.
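
A minimal Python sketch of the kind of multi-state Markov channel model mentioned above: a three-state chain drives per-rate frame success probabilities, and a link-quality-only policy picks, per state, the rate maximizing expected PHY goodput. All matrices and probabilities are illustrative assumptions, and the MAC contention part of the cross-layer model is omitted.

```python
import numpy as np

# Hypothetical 3-state discrete-time Markov channel (bad/medium/good).
P = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])          # state transition matrix
rates = np.array([6.0, 24.0, 54.0])      # 802.11-style PHY rates, Mb/s
p_succ = np.array([[0.9, 0.3, 0.0],      # P(frame success | state, rate)
                   [1.0, 0.9, 0.2],
                   [1.0, 1.0, 0.9]])

# Stationary distribution of the channel chain (left eigenvector for 1).
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# Link-quality-only adaptation: per state, maximize rate * P(success).
best = np.argmax(rates[None, :] * p_succ, axis=1)
goodput = np.sum(pi * rates[best] * p_succ[np.arange(3), best])
print("per-state rates:", rates[best], "avg goodput (Mb/s):", round(goodput, 1))
```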

Relevance: 90.00%

Abstract:

Link-quality-based rate adaptation has been widely used in IEEE 802.11 networks. However, network performance is affected by both link quality and random channel access. Selecting transmit modes for optimal link throughput can cause medium access control (MAC) throughput loss. In this paper, we investigate this issue and propose a generalised cross-layer rate adaptation algorithm. It jointly considers link quality and channel access to optimise network throughput. The objective is to examine the potential benefits of cross-layer design. An efficient analytic model is proposed to evaluate rate adaptation algorithms under dynamic channel and multi-user access environments. The proposed algorithm is compared with a link-throughput-optimisation-based algorithm. It is found that rate adaptation based on optimising link-layer throughput alone can result in a large performance loss, which cannot be compensated for by optimising the MAC access mechanism alone. Results show that cross-layer design can achieve consistent and considerable performance gains of up to 20%, and it therefore deserves to be exploited in practical designs for IEEE 802.11 networks.
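
The MAC throughput loss mentioned above can be illustrated with a back-of-the-envelope model: once the fixed per-frame air-time cost of contention, headers and acknowledgements is accounted for, the mode that maximizes raw link goodput (rate times success probability) is not necessarily the one that maximizes MAC-level throughput. The numbers below are illustrative assumptions, not the paper's model.

```python
# Sketch of why link-optimal and MAC-aware rate choices can differ.
PAYLOAD_BITS = 12_000          # 1500-byte frame
MAC_OVERHEAD_S = 300e-6        # contention + headers + ACK air time per frame

def link_goodput(rate_mbps, p_succ):
    """PHY-only view: expected delivered bits per unit of transmit time."""
    return rate_mbps * p_succ

def mac_throughput(rate_mbps, p_succ):
    """MAC view: payload divided by total air time per transmission attempt."""
    t_frame = PAYLOAD_BITS / (rate_mbps * 1e6) + MAC_OVERHEAD_S
    return p_succ * PAYLOAD_BITS / t_frame / 1e6   # Mb/s

modes = [(6.0, 0.99), (24.0, 0.80), (54.0, 0.45)]  # (rate, P_success) pairs
by_link = max(modes, key=lambda m: link_goodput(*m))
by_mac = max(modes, key=lambda m: mac_throughput(*m))
print("link-optimal mode:", by_link, "| MAC-aware mode:", by_mac)
```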

Relevance: 90.00%

Abstract:

The wide adoption of the Internet Protocol (IP) as the de facto protocol for most communication networks has established a need for IP-capable data link layer solutions for machine-to-machine (M2M) and Internet of Things (IoT) networks. However, the wireless networks used for M2M and IoT applications usually lack the resources commonly associated with modern wireless communication networks. Existing IP-capable data link layer solutions for wireless IoT networks provide the necessary overhead-minimising and frame-optimising features, but are often built to be compatible only with IPv6 and specific radio platforms. The objective of this thesis is to design an IPv4-compatible data link layer for Netcontrol Oy's narrowband half-duplex packet data radio system. Based on extensive literature research, system modelling and solution concept testing, this thesis proposes using the tunslip protocol as the basis for the system's data link layer protocol development. In addition to the functionality of tunslip, this thesis discusses the additional network, routing, compression, security and collision avoidance changes that must be made to the radio platform for it to be IP-compatible while still maintaining its point-to-multipoint and multi-hop network characteristics. The data link layer design consists of the radio application, a dynamic Maximum Transmission Unit (MTU) optimisation daemon, and the tunslip interface. The proposed design uses tunslip to create an IP-capable data link protocol interface. The radio application receives data from tunslip, compresses the packets, and uses the IP addressing information for radio network addressing and routing before forwarding the message to the radio network. The dynamic MTU size optimisation daemon adjusts the maximum MTU size of the tunslip interface according to a link quality assessment calculated from radio network diagnostic data received from the radio application. To determine the usability of tunslip as the basis for the data link layer protocol, the tunslip interface is tested with both IEEE 802.15.4 radios and packet data radios. The test cases measure the usability of the radio network for User Datagram Protocol (UDP) based applications without applying any header or content compression. The test results for the packet data radios reveal that the typical success rate for packet reception over a single-hop link is above 99%, with a round-trip delay of 0.315 s for 63-byte packets.
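
A hypothetical sketch of the dynamic MTU optimisation idea: shrink the MTU of the tunslip interface when the link quality estimate is poor (shorter frames are more likely to survive) and grow it again on good links. The quality metric, the MTU steps and the iproute2 call are assumptions for illustration, not Netcontrol's implementation.

```python
import subprocess

MTU_STEPS = [296, 576, 1006, 1500]   # candidate MTU sizes in bytes

def pick_mtu(link_quality: float) -> int:
    """Map a 0..1 link-quality estimate (e.g., derived from radio
    diagnostic data) to one of the candidate MTU steps."""
    idx = min(int(link_quality * len(MTU_STEPS)), len(MTU_STEPS) - 1)
    return MTU_STEPS[idx]

def apply_mtu(iface: str, mtu: int) -> None:
    """Apply the chosen MTU to the tunslip network interface via iproute2."""
    subprocess.run(["ip", "link", "set", "dev", iface, "mtu", str(mtu)],
                   check=True)

# Example: a mediocre link quality of 0.55 selects a mid-sized MTU.
print(pick_mtu(0.55))   # -> 1006
```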

Relevance: 80.00%

Abstract:

Network-induced delay in networked control systems (NCS) is inherently non-uniformly distributed and exhibits a multifractal nature. However, such network characteristics have not been well considered in NCS analysis and synthesis. Making use of the statistical distribution of the network-induced delay, a delay-distribution-based stochastic model is adopted to link Quality-of-Control and network Quality-of-Service for NCS with uncertainties. From this model, together with a tighter bounding technique for cross terms, H∞ NCS analysis is carried out with significantly improved stability results. Furthermore, a memoryless H∞ controller is designed to stabilize the NCS and to achieve the prescribed disturbance attenuation level. Numerical examples are given to demonstrate the effectiveness of the proposed method.
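
For orientation, a common form of such a delay-distribution-based model is sketched below; the notation is a generic textbook formulation assumed for illustration, not the paper's exact formulation.

```latex
% Plant and controlled output with a time-varying input delay \tau(t):
\dot{x}(t) = A\,x(t) + B\,u(t-\tau(t)) + E\,w(t), \qquad
z(t) = C\,x(t) + D\,u(t-\tau(t))
% The delay-distribution information enters through interval
% probabilities estimated from network measurements:
\Pr\{\tau(t)\in[0,\tau_1)\} = \delta, \qquad
\Pr\{\tau(t)\in[\tau_1,\tau_2]\} = 1-\delta
% A memoryless state-feedback controller u(t) = K x(t) is sought such
% that the closed loop is stochastically stable and achieves the
% H-infinity attenuation \|z\|_2 \le \gamma \|w\|_2 for all w \neq 0.
```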

Relevance: 80.00%

Abstract:

In the past few years, numerous data collection protocols have been developed for wireless sensor networks (WSNs). However, there has been no comparison of their relative performance in realistic environments. Here we report the results of an empirical study, using a Fleck3 sensor network testbed, of four data collection protocols: one-phase-pull Directed Diffusion (DD), Expected Number of Transmissions (ETX), ETX with explicit acknowledgment (ETX-eAck), and ETX with implicit acknowledgment (ETX-iAck). Our empirical study provides useful insights for future sensor network deployments. When the required application end-to-end reliability is not strict (e.g., 70%) and link quality is good, DD and ETX are the best options because of their simplicity and low routing overhead. Both ETX-eAck and ETX-iAck achieve more than 90% end-to-end reliability when the link quality is reasonable (less than 25% packet loss). When the link quality is good, ETX-iAck introduces significantly less routing overhead (by up to 50%) than ETX-eAck. However, if the radio transceiver supports variable packet lengths, ETX-eAck can outperform ETX-iAck when the link quality is poor. The important message from this paper is that the choice of data collection protocol should come after the operating environment is understood. This understanding must include the characteristics of the radio transceiver and link loss statistics from a long-term (across seasons and weather variation) radio survey of the site.
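
For reference, the ETX metric underlying three of the four protocols is the expected number of transmissions needed for a frame to be delivered and acknowledged over a link, computed from the measured forward and reverse delivery ratios; a route's metric is the sum over its links. A minimal sketch:

```python
def etx(d_f: float, d_r: float) -> float:
    """Expected number of transmissions over a link: d_f is the forward
    delivery ratio (data frame) and d_r the reverse delivery ratio
    (link-layer ACK); an attempt succeeds only if both arrive."""
    return 1.0 / (d_f * d_r)

# A route's ETX is the sum of its link ETX values; the protocol prefers
# the route with the smallest total.
route = [(0.9, 0.95), (0.8, 0.9)]        # (d_f, d_r) per hop
print(sum(etx(f, r) for f, r in route))  # ~2.56 expected transmissions
```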

Relevance: 80.00%

Abstract:

The validity of using rainfall characteristics as lumped parameters for investigating pollutant wash-off processes, such as first-flush occurrence, is questionable. This research study introduces an innovative concept of using sector parameters to investigate the relationship between the pollutant wash-off process and different sectors of the runoff hydrograph and rainfall hyetograph. The research outcomes indicated that rainfall depth and rainfall intensity are the two key rainfall characteristics influencing the wash-off process, compared to the antecedent dry period. Additionally, the rainfall pattern also plays a critical role in the wash-off process and is independent of the catchment characteristics. The knowledge created through this research study provides the ability to select appropriate rainfall events for stormwater quality treatment design based on the required treatment outcomes, such as the need to target different sectors of the runoff hydrograph or particular pollutant species. The study outcomes can also contribute to enhancing stormwater quality modelling and prediction, given that conventional approaches to stormwater quality estimation are primarily based on rainfall intensity rather than considering other rainfall parameters, or are solely based on stochastic approaches irrespective of the characteristics of the rainfall event.

Relevance: 80.00%

Abstract:

The relationship between site characteristics and understorey vegetation composition was analysed with quantitative methods, especially from the viewpoint of site quality estimation. Theoretical models were applied to an empirical data set collected from the upland forests of southern Finland, comprising 104 sites dominated by Scots pine (Pinus sylvestris L.) and 165 sites dominated by Norway spruce (Picea abies (L.) Karsten). Site index H100 was used as an independent measure of site quality. A new model for estimating site quality at sites with a known understorey vegetation composition was introduced. It is based on applying Bayes' theorem to the density function of site quality within the study area, combined with species-specific presence-absence response curves. The resulting posterior probability density function may be used to calculate an estimate of the site variable. Using this method, a jackknife estimate of site index H100 was calculated separately for pine- and spruce-dominated sites. The results indicated that the cross-validation root mean squared error (RMSEcv) of the estimates improved from 2.98 m down to 2.34 m relative to the "null" model (the standard deviation of the sample distribution) in pine-dominated forests. In spruce-dominated forests, RMSEcv decreased from 3.94 m down to 3.16 m. To assess these results, four other estimation methods based on understorey vegetation composition were applied to the same data set. The results showed that none of the methods was clearly superior to the others. In pine-dominated forests, RMSEcv varied between 2.34 and 2.47 m, and the corresponding range for spruce-dominated forests was from 3.13 to 3.57 m.
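
The Bayesian scheme described above has a simple computational core: the posterior density of the site index is proportional to its prior density in the study area multiplied, for each species, by the response-curve probability of the observed presence or absence. The Python sketch below illustrates this with a hypothetical prior and hypothetical logistic response curves; the original model's fitted curves and Finnish data are not reproduced.

```python
import numpy as np

H = np.linspace(10, 35, 251)                     # site index H100 grid (m)
prior = np.exp(-0.5 * ((H - 24.0) / 4.0) ** 2)   # hypothetical prior density

def presence_prob(H, h0, slope):
    """Hypothetical logistic response curve: probability that a species
    is present as a function of site index."""
    return 1.0 / (1.0 + np.exp(-slope * (H - h0)))

# Observed understorey composition: (h0, slope, observed presence).
species = [(20.0, 0.8, True), (27.0, 0.6, False), (18.0, 1.2, True)]

post = prior.copy()
for h0, slope, present in species:               # Bayes: prior x likelihoods
    p = presence_prob(H, h0, slope)
    post *= p if present else (1.0 - p)

dH = H[1] - H[0]
post /= post.sum() * dH                          # normalise the density
print("posterior mean H100:", round(float((H * post).sum() * dH), 2), "m")
```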

Relevance: 80.00%

Abstract:

In a typical enterprise WLAN, a station has a choice of multiple access points to associate with. The default association policy is based on metrics such as Received Signal Strength (RSS) and "link quality" to choose a particular access point among many. Such an approach can lead to unequal load sharing and diminished system performance. We consider the RAT (Rate And Throughput) policy [1], which leads to better system performance. The RAT policy has been implemented on a home-grown centralized WLAN controller, ADWISER [2], and we demonstrate that the RAT policy indeed provides better system performance.
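
The intuition for why association should look beyond signal strength can be sketched with a toy model; this illustrates the general idea only, not the RAT policy of [1], and the throughput model and numbers are assumptions.

```python
# An AP with the strongest signal but many associated stations may deliver
# less throughput to a new client than a lightly loaded AP with a
# moderate signal.

def expected_throughput(phy_rate_mbps: float, n_stations: int) -> float:
    """Naive model: the PHY rate is time-shared among the stations the
    new client would join."""
    return phy_rate_mbps / (n_stations + 1)

aps = {"ap1": {"rss": -40, "rate": 54.0, "stations": 9},
       "ap2": {"rss": -60, "rate": 24.0, "stations": 1}}

by_rss = max(aps, key=lambda a: aps[a]["rss"])
by_tput = max(aps, key=lambda a: expected_throughput(aps[a]["rate"],
                                                     aps[a]["stations"]))
print("RSS choice:", by_rss, "| throughput-aware choice:", by_tput)
```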