924 results for Packet-forwarding scheme
Abstract:
Alzaid et al. proposed a forward- and backward-secure key management scheme for wireless sensor networks in Process Control Systems (PCSs) and Supervisory Control and Data Acquisition (SCADA) systems. The scheme, however, is still vulnerable to an attack called the sandwich attack: an adversary who captures two sensor nodes at times t1 and t2 can recover all the group keys used between t1 and t2. In this paper, a fix to the scheme is proposed that limits the vulnerable duration to an arbitrarily chosen time span while leaving the forward and backward secrecy of the scheme untouched. A performance analysis of our proposal, Alzaid et al.'s scheme, and Nilsson et al.'s scheme is then given.
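As an illustration of the kind of mechanism involved, the sketch below shows a one-way hash-chain key update with periodic re-seeding. It is a generic textbook construction, not Alzaid et al.'s actual scheme or the proposed fix; the function names and the epoch structure are assumptions.

```python
import hashlib

def next_key(key: bytes) -> bytes:
    """Derive the next group key by hashing the current one.

    A one-way chain gives forward secrecy: a key captured at
    time t does not reveal the keys used before t."""
    return hashlib.sha256(key).digest()

def rekey_epochs(seed: bytes, epochs: int) -> list:
    """Generate one key per epoch from a fresh seed.

    Restarting the chain from a new, independently generated seed
    every few epochs bounds the window an attacker can reconstruct
    after capturing nodes at two different times."""
    keys, k = [], seed
    for _ in range(epochs):
        keys.append(k)
        k = next_key(k)
    return keys
```

The compromise window is then at most one seed lifetime, which matches the idea of limiting the vulnerable duration to a chosen time span.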
Abstract:
Vehicular ad hoc network (VANET) is a wireless ad hoc network that operates in a vehicular environment to provide communication between vehicles. VANET can be used by a diverse range of applications to improve road safety. A cooperative collision warning system (CCWS) is one of the safety applications that can provide situational awareness and warnings to drivers by exchanging safety messages between cooperative vehicles. Currently, the routing strategies for safety message dissemination in CCWS are based on scoped broadcast. However, broadcast schemes are inefficient, as a warning message is sent to a large number of vehicles in the area rather than only the endangered vehicles. They also cannot prioritize the receivers based on their critical time to avoid collision. This paper presents a more efficient multicast routing scheme that can reduce unnecessary transmissions and also uses an adaptive transmission range. The multicast scheme involves methods to identify an abnormal vehicle, the vehicles that may be endangered by the abnormal vehicle, and the latest time for each endangered vehicle to receive the warning message in order to avoid the danger. We transform this multicast routing problem into a delay-constrained minimum Steiner tree problem, so existing algorithms can be used to solve it. The advantages of our multicast routing scheme are mainly its potential to support various road traffic scenarios, to optimize wireless channel utilization, and to prioritize the receivers.
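The per-receiver deadline mentioned above (the latest time an endangered vehicle can still receive the warning) can be sketched with simple kinematics. The reaction-time constant, the 10 s horizon, and all names below are illustrative assumptions, not the paper's actual method.

```python
def warning_deadline(gap_m, closing_speed_mps, reaction_s=1.0):
    """Latest time (seconds from now) a vehicle can receive the
    warning and still react: time-to-collision minus an assumed
    driver reaction time."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing in: no deadline
    return gap_m / closing_speed_mps - reaction_s

def endangered(vehicles, horizon_s=10.0):
    """Select vehicles whose deadline falls within the horizon and
    sort them most-urgent first (receiver prioritisation).

    `vehicles` is a list of (id, gap_m, closing_speed_mps)."""
    hits = [(warning_deadline(g, v), vid) for vid, g, v in vehicles]
    return sorted((d, vid) for d, vid in hits if d < horizon_s)
```

These deadlines would become the delay constraints on the multicast (Steiner tree) problem, one per endangered receiver.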
Abstract:
The previously distinct boundary between airports and their cities has become increasingly blurred as new interests and actors are identified as important stakeholders in the decision making process. As a consequence, airport entities are more than ever seeking an integrated existence with their surrounding regions. While current planning strategies provide insights on how to improve and leverage land use planning in and around airports, emerging challenges for implementing and protecting these planning ideals stem from the governance shadows of development decisions. The thesis of this paper is that improving the identification, articulation and consideration of city and airport interests in the development approval process (between planning and implementation) can help avoid outcomes that hinder the ability of cities and their airports to meet their separate/mutual long-term objectives. By applying a network governance perspective to the pilot case study of Brisbane, analysis of overlapping and competing actor interests shows how different governance arrangements facilitate (or impede) decision making that protects sustainable ‘airport region’ development.

Contributions are made to airport and city development decision makers through the identification and analysis of effective and ineffective decision making pathways, and to governance literature by way of forwarding empirically derived frameworks for showing how actors protect their interests in the ‘crowded decision making domain’ of airport region development. This work was carried out through the Airport Metropolis Research Project under the Australian Research Council’s Linkage Projects funding scheme (LP0775225).
Abstract:
TCP is the dominant protocol for reliable communication over the Internet. It provides flow, congestion, and error control mechanisms designed for reliable wired networks. Its congestion control mechanism is not suitable for wireless links, where data corruption and loss rates are higher. The physical links are transparent to TCP, which attributes every packet loss to congestion and responds by reducing the transmission rate. This wastes the already limited bandwidth available on wireless links. There is therefore little point in researching ways to increase the bandwidth of wireless links until the available bandwidth is optimally utilized. This paper proposes a hybrid scheme called TCP Detection and Recovery (TCP-DR) that distinguishes between congestion, corruption, and mobility-related losses and then instructs the sending host to take the appropriate action, so that link utilization remains optimal even when losses are due to high bit error rates or mobility.
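A minimal sketch of the kind of loss classification such a scheme performs is given below. The inputs, the SNR and RTT thresholds, and the decision order are illustrative assumptions, not TCP-DR's actual detection logic.

```python
def classify_loss(rtt_ms, baseline_rtt_ms, snr_db, link_up=True):
    """Heuristic loss classifier in the spirit of separating
    congestion, corruption, and mobility losses (all thresholds
    are illustrative):

    - link down      -> mobility (freeze the window, resume later)
    - low SNR        -> corruption (retransmit, keep the window)
    - inflated RTT   -> congestion (invoke normal TCP backoff)
    - otherwise      -> treat as random wireless corruption."""
    if not link_up:
        return "mobility"
    if snr_db < 10.0:
        return "corruption"
    if rtt_ms > 1.5 * baseline_rtt_ms:
        return "congestion"
    return "corruption"
```

Only the "congestion" outcome would trigger TCP's rate reduction, which is how such a scheme avoids wasting wireless bandwidth on non-congestion losses.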
Abstract:
Islanded operation, protection, reclosing and arc extinguishing are some of the challenging issues related to the connection of converter interfaced distributed generators (DGs) into a distribution network. The isolation of upstream faults in grid connected mode and fault detection in islanded mode using overcurrent devices are difficult. In the event of an arc fault, all DGs must be disconnected in order to extinguish the arc; otherwise, they will continue to feed the fault, thus sustaining the arc. However, system reliability can be increased by maximising DG connectivity to the system; therefore, the system protection scheme must ensure that only the faulted segment is removed from the feeder. This is true even in the case of a radial feeder, as DGs can be connected at various points along the feeder. In this paper, a new relay scheme is proposed which, along with a novel current control strategy for converter interfaced DGs, can isolate permanent and temporary arc faults. The proposed protection and control scheme can even coordinate with reclosers. The results are validated through PSCAD/EMTDC simulations and MATLAB calculations.
Abstract:
The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on the computation, transmission, and storage costs. This decomposition structure is based on analysis of the information packing performance of several decompositions, two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and the human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criteria as well as the sensitivities in human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach.
In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training and multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice outermost shell, while it properly maintains a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only the cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training and multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high quality reconstructed images with better compression ratios than other available algorithms.
To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
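As a concrete example of fitting a generalized Gaussian model to wavelet coefficients, the sketch below uses the classical moment-matching (ratio-inversion) estimator for the shape parameter rather than the least-squares fit described in the thesis; the bisection bounds and interface are assumptions.

```python
from math import gamma, sqrt

def ggd_ratio(beta):
    """Theoretical ratio E|X| / sqrt(E[X^2]) for a zero-mean
    generalized Gaussian with shape parameter beta; this ratio
    is monotonically increasing in beta."""
    return gamma(2.0 / beta) / sqrt(gamma(1.0 / beta) * gamma(3.0 / beta))

def estimate_shape(samples):
    """Estimate the GGD shape parameter by matching the sample
    moment ratio to ggd_ratio, inverted by bisection.
    beta = 2 corresponds to Gaussian, beta = 1 to Laplacian."""
    n = len(samples)
    m1 = sum(abs(x) for x in samples) / n
    m2 = sum(x * x for x in samples) / n
    target = m1 / sqrt(m2)
    lo, hi = 0.1, 10.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if ggd_ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

The fitted shape (and scale, from the second moment) would then drive per-subband quantizer design and bit allocation.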
Abstract:
This paper proposes a novel peak load management scheme for rural areas. The scheme transfers certain customers onto local nonembedded generators during peak load periods to alleviate network under voltage problems. This paper develops and presents this system by way of a case study in Central Queensland, Australia. A methodology is presented for determining the best location for the nonembedded generators as well as the number of generators required to alleviate network problems. A control algorithm to transfer and reconnect customers is developed to ensure that the network voltage profile remains within specification under all plausible load conditions. Finally, simulations are presented to show the performance of the system over a typical maximum daily load profile with large stochastic load variations.
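The transfer-and-reconnect logic described above can be sketched as a simple hysteresis rule on the measured voltage. The per-unit thresholds and the interface below are illustrative assumptions, not the paper's actual control algorithm.

```python
def transfer_controller(voltage_pu, on_gen, low=0.94, high=0.97):
    """Hysteresis rule for moving a customer between the network
    and a local non-embedded generator (thresholds illustrative):

    - voltage below `low` while on the network: transfer to generator
    - voltage above `high` while on the generator: reconnect
    - otherwise hold the current state, which prevents hunting
      when the voltage sits near a threshold."""
    if voltage_pu < low and not on_gen:
        return True    # transfer to generator
    if voltage_pu > high and on_gen:
        return False   # reconnect to network
    return on_gen      # hold
```

The dead band between the two thresholds is what keeps the controller stable under the large stochastic load variations mentioned in the abstract.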
Abstract:
A special transmit polarization signalling scheme is presented to alleviate the power reduction that results from polarization mismatch caused by random antenna orientations. This is particularly useful for handheld mobile terminals, typically equipped with only a single linearly polarized antenna, since the average signal power is desensitized against receiver orientations. Numerical simulations also show adequate robustness against incorrect channel estimates.
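A small numerical sketch of the underlying cos² mismatch law, assuming a scheme that alternates between two orthogonal transmit polarizations; this is one plausible reading of the signalling idea, not necessarily the authors' exact method.

```python
import math

def mismatch_loss(theta):
    """Received power fraction for a linear RX antenna rotated by
    theta relative to a linearly polarized TX field (cos^2 law)."""
    return math.cos(theta) ** 2

def avg_single_pol(n=3600):
    """Average power over uniformly spaced receiver orientations
    with one fixed TX polarization: 0.5 (a 3 dB average loss),
    but individual orientations can fade to zero."""
    return sum(mismatch_loss(2 * math.pi * k / n) for k in range(n)) / n

def avg_switched_pol(n=3600):
    """Alternating two orthogonal TX polarizations: per orientation
    the two-slot power is (cos^2 + sin^2)/2 = 0.5 exactly, so the
    average is unchanged but orientation-dependent fades vanish."""
    return sum(0.5 * (mismatch_loss(t) + mismatch_loss(t + math.pi / 2))
               for t in (2 * math.pi * k / n for k in range(n))) / n
```

The point of the comparison is the variance, not the mean: with switching, no receiver orientation is ever fully cross-polarized.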
Abstract:
In the past eight years, Australia has adopted the use of environmental offsets as a means to compensate for environmental degradation caused by development. Queensland has more environmental offsetting policies than any other Australian State or Territory. The methodology has profound effects on development companies, landowners (both private and public), regional land planning organizations, government agencies, monetary banking institutions and environmental conservation bodies.
Abstract:
Technology-mediated collaboration processes have been extensively studied for over a decade. Most applications with collaboration concepts reported in the literature focus on enhancing the efficiency and effectiveness of decision-making processes in objective and well-structured workflows. However, relatively few previous studies have investigated the application of collaboration schemes to problems of a subjective and unstructured nature. In this paper, we explore a new intelligent collaboration scheme for fashion design which, by nature, relies heavily on human judgment and creativity. Techniques such as multicriteria decision making, fuzzy logic, and artificial neural network (ANN) models are employed. Industrial data sets are used for the analysis. Our experimental results suggest that the proposed scheme exhibits significant improvement over the traditional method in terms of time–cost effectiveness, and a company interview with design professionals has confirmed its effectiveness and significance.
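As a toy illustration of the multicriteria and fuzzy ingredients mentioned above, the sketch below ranks candidate designs by a fuzzy weighted sum with centroid defuzzification. The triangular-fuzzy representation, weights, and names are assumptions; the paper's actual scheme (including its ANN component) is more elaborate.

```python
def defuzzify(tri):
    """Centroid of a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return (a + b + c) / 3.0

def rank_designs(ratings, weights):
    """Simple fuzzy weighted-sum MCDM: ratings[design] is a list of
    triangular fuzzy judgments, one per criterion, aligned with
    `weights`. Returns design ids, best first."""
    scores = {
        d: sum(w * defuzzify(r) for w, r in zip(weights, rs))
        for d, rs in ratings.items()
    }
    return sorted(scores, key=scores.get, reverse=True)
```

For example, with criteria weighted 0.7/0.3, a design rated (6, 7, 8) and (4, 5, 6) scores 6.4 and outranks one rated (2, 3, 4) and (8, 9, 10), which scores 4.8.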
Abstract:
It could be said that road congestion is one of the most significant problems within any modern metropolitan area. For several decades now, around the globe, congestion in metropolitan areas has been worsening for two main reasons. Firstly, road congestion has significantly increased due to a higher demand for road space because of growth in populations, economic activity and incomes (Hensher & Puckett, 2007). This factor, in conjunction with a significant lack of investment in new road and public transport infrastructure, has seen the road network capacities of cities exceeded by traffic volumes and thus, resulted in increased traffic congestion. This relentless increase in road traffic congestion has resulted in a dramatic increase in costs for both the road users and ultimately the metropolitan areas concerned (Bureau of Transport and Regional Economics, 2007). In response to this issue, several major cities around the world, including London, Stockholm and Singapore, have implemented congestion-charging schemes in order to combat the effects of road congestion. A congestion-charging scheme provides a mechanism for regulating traffic flows into the congested areas of a city, whilst simultaneously generating public revenue that can be used to improve both the public transport and road networks of the region. The aim of this paper was to assess the concept of congestion-charging, whilst reflecting on the experiences of various cities that have already implemented such systems. The findings from this paper have been used to inform the design of a congestion-charging scheme for the city of Brisbane in Australia in a supplementary study (Whitehead, Bunker, & Chung, 2011). The first section of this paper examines the background to road congestion; the theory behind different congestion-charging schemes; and the various technologies involved with the concept. 
The second section of this paper details the experiences, in relation to implementing a congestion-charging scheme, from the city of Stockholm in Sweden. This research has been crucial in forming a list of recommendations and lessons learnt for the design of a congestion-charging scheme in Australia. It is these recommendations that directly inform the proposed design of the Brisbane Cordon Scheme detailed in Whitehead et al. (2011).
Abstract:
As detailed in Whitehead, Bunker and Chung (2011), a congestion-charging scheme provides a mechanism to combat congestion whilst simultaneously generating revenue to improve both the road and public transport networks. The aim of this paper is to assess the feasibility of implementing a congestion-charging scheme in the city of Brisbane in Australia and to determine the potential effects of this initiative. In order to do so, a congestion-charging scheme was designed for Brisbane and modelled using the Brisbane Strategic Transport Model with a baseline year of 2026. This paper argues that the implementation of this initiative would prove effective in reducing the city's road congestion and increasing the overall sustainability of the region.