69 results for Methods engineering.

in Deakin Research Online - Australia


Relevance:

60.00%

Publisher:

Abstract:

Attribute-Based Encryption (ABE) is a promising cryptographic primitive which significantly enhances the versatility of access control mechanisms. Due to the high expressiveness of ABE policies, however, the computational complexity of ABE key-issuing and decryption can become prohibitively high. Although existing outsourced ABE solutions can offload some intensive computing tasks to a third party, the verifiability of the results returned from the third party has yet to be addressed. To tackle this challenge, we propose a new secure outsourced ABE system which supports both secure outsourced key-issuing and decryption. Our new method offloads all access-policy- and attribute-related operations in the key-issuing process and decryption to a Key Generation Service Provider (KGSP) and a Decryption Service Provider (DSP), respectively, leaving only a constant number of simple operations for the attribute authority and eligible users to perform locally. In addition, for the first time, we propose an outsourced ABE construction which provides checkability of the outsourced computation results in an efficient way. Extensive security and performance analysis shows that the proposed schemes are provably secure and practical. © 2013 IEEE.
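As a rough illustration of the checkability idea, the sketch below shows one generic verification pattern for outsourced computation: the client hides a cheap precomputed test query among the real queries and rejects the batch if the untrusted server answers it incorrectly. This is not the paper's pairing-based construction; the group, the test-query technique and all names are illustrative assumptions.

```python
# Generic checkability sketch for outsourced modular exponentiation.
# Illustrative toy parameters, not the paper's OABE scheme.
import random

P = 2**127 - 1          # Mersenne prime, toy group modulus
G = 3                   # toy base element

def untrusted_server(base, exponent, cheat=False):
    # The third party does the heavy modular exponentiation.
    result = pow(base, exponent, P)
    return result + 1 if cheat else result

def outsource_with_check(exponents, cheat=False):
    # Precompute one cheap test query locally; a small exponent is
    # inexpensive to verify compared with the full-size ones.
    t = random.randrange(2, 2**16)
    expected = pow(G, t, P)
    queries = exponents + [t]
    random.shuffle(queries)        # hide the test among real queries
    answers = {e: untrusted_server(G, e, cheat) for e in queries}
    if answers[t] != expected:
        raise ValueError("outsourced result failed verification")
    return [answers[e] for e in exponents]

results = outsource_with_check([10**30, 10**31])
print("verified", len(results), "outsourced results")
# outsource_with_check([10**30], cheat=True)  -> ValueError
```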

Relevance:

60.00%

Publisher:

Abstract:

This paper initiates the study of two specific security threats to smart-card-based password authentication in distributed systems. Smart-card-based password authentication is one of the most commonly used security mechanisms to determine the identity of a remote client, who must hold a valid smart card and the corresponding password to carry out a successful authentication with the server. The authentication is usually integrated with a key establishment protocol, yielding smart-card-based password-authenticated key agreement. Using two recently proposed protocols as case studies, we demonstrate two new types of adversaries with a smart card: 1) adversaries with pre-computed data stored in the smart card, and 2) adversaries with different data (with respect to different time slots) stored in the smart card. These threats, though realistic in distributed systems, have never been studied in the literature. In addition to pointing out the vulnerabilities, we propose countermeasures to thwart these security threats and secure the protocols. © 2013 IEEE.
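A minimal sketch of why the first adversary type is dangerous, under an assumed (not the paper's) card layout: if the card stores V = H(x, ID) XOR H(pw) and the adversary already knows the pre-computed value H(x, ID), extracting V reduces the password to an offline dictionary attack. All values and the protocol shape here are hypothetical.

```python
# Offline guessing enabled by pre-computed data in the smart card.
# Hypothetical card layout, not the case-study protocols.
import hashlib

def H(*parts):
    h = hashlib.sha256()
    for p in parts:
        h.update(p.encode())
    return int.from_bytes(h.digest(), "big")

server_secret = "x-known-only-to-server"
password = "sunshine"
# Card personalization: the issuer writes V onto the card.
V = H(server_secret, "alice") ^ H(password)

# An adversary who already holds the pre-computed value H(x, ID)
# extracts V from the card and guesses passwords offline.
known = H(server_secret, "alice")
for guess in ["letmein", "123456", "sunshine"]:
    if known ^ H(guess) == V:
        print("password recovered:", guess)
        break
```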

Relevance:

60.00%

Publisher:

Abstract:

Texture classification is one of the most important tasks in the field of computer vision and has been extensively investigated over the last several decades. Previous texture classification methods mainly used template-matching-based methods such as Support Vector Machines and k-Nearest-Neighbour for classification. Given enough training images, the state-of-the-art texture classification methods can achieve very high classification accuracies on some benchmark databases. However, when the number of training images is limited, which usually happens in real-world applications because of the high cost of obtaining labelled data, the classification accuracies of those state-of-the-art methods deteriorate due to overfitting. In this paper, we aim to develop a novel framework that can correctly classify textural images with only a small number of training images. By taking into account the repetition and sparsity properties of textures, we propose a sparse-representation-based multi-manifold analysis framework for texture classification from few training images. A set of new training samples is generated from each training image by a scale and spatial pyramid, and the training samples belonging to each class are then modelled by a manifold based on sparse representation. We learn a dictionary of sparse representation and a projection matrix for each class and classify the test images based on the projected reconstruction errors. The framework provides a more compact model than template-matching-based texture classification methods and mitigates overfitting. Experimental results show that the proposed method achieves reasonably high generalization capability even with as few as three training images, and significantly outperforms state-of-the-art texture classification approaches on three benchmark datasets. © 2014 Elsevier B.V. All rights reserved.
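A minimal sketch of the classify-by-reconstruction-error step, under stated assumptions: random vectors stand in for texture features, ordinary least squares stands in for sparse coding, and the scale/spatial pyramid and learned projection matrices are omitted.

```python
# Classify by per-class reconstruction error: each class gets a
# dictionary built from its few training samples, and a test sample
# is assigned to the class whose dictionary reconstructs it best.
import numpy as np

rng = np.random.default_rng(0)

def make_class_samples(center, n=3, dim=20, noise=0.1):
    # n noisy training samples around a class "center" feature.
    return center + noise * rng.standard_normal((dim, n))

centers = [rng.standard_normal((20, 1)) for _ in range(3)]
dictionaries = [make_class_samples(c) for c in centers]  # dim x n each

def classify(x):
    errors = []
    for D in dictionaries:
        # Least squares stands in for sparse coding here.
        coef, *_ = np.linalg.lstsq(D, x, rcond=None)
        errors.append(np.linalg.norm(x - D @ coef))
    return int(np.argmin(errors))

test = centers[1][:, 0] + 0.1 * rng.standard_normal(20)
print("predicted class:", classify(test))   # expected: 1
```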

Relevance:

60.00%

Publisher:

Abstract:

Biomedical time series clustering, which automatically groups a collection of time series according to their internal similarity, is important for medical record management and inspection tasks such as bio-signal archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), originally developed for visual motion analysis, to cluster a set of unlabelled multichannel time series. H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to variations of parameters, including the length of local segments and the dictionary size. Although the experimental evaluation used multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for clustering multichannel biomedical time series according to their structural similarity, with many applications in biomedical time series management.
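A minimal sketch of the document/word analogy, under stated assumptions: a sliding window extracts local segments, and a crude quantization turns each segment into a discrete "word"; the two-layer H-pLSA topic model itself is not reproduced, and the window length and quantization levels are arbitrary.

```python
# "Time series as document, segments as words": summarize one
# channel as a bag-of-words histogram of quantized local segments.
import numpy as np
from collections import Counter

def bag_of_segments(channel, win=8, levels=4):
    lo, hi = channel.min(), channel.max()
    words = []
    for start in range(0, len(channel) - win + 1, win // 2):
        seg = channel[start:start + win]
        # Quantize the segment shape into a short discrete code.
        code = tuple(np.digitize(seg, np.linspace(lo, hi, levels)))
        words.append(code)
    return Counter(words)

rng = np.random.default_rng(1)
ecg_like = np.sin(np.linspace(0, 20, 256)) + 0.05 * rng.standard_normal(256)
hist = bag_of_segments(ecg_like)
print("vocabulary size:", len(hist))
print("most frequent word count:", hist.most_common(1)[0][1])
```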

Relevance:

60.00%

Publisher:

Abstract:

Several grouping proof protocols for RFID systems have been proposed over the years, but they are either vulnerable to certain attacks or do not comply with the EPC class-1 gen-2 (C1G2) standard because they use hash functions or other complex encryption schemes. Among other requirements, synchronization of keys, simultaneity, dependence, detecting illegitimate tags, eliminating unwanted tag processing, and denial-of-proof attacks have not been fully addressed by many of these protocols. Our protocol addresses these important gaps by taking a holistic approach to grouping proofs and provides forward security, which is an open research issue. The protocol is based on simple XOR encryption and 128-bit pseudorandom number generators, operations that can be easily implemented on low-cost passive tags. Thus, our protocol enables large-scale implementations and achieves EPC C1G2 compliance while meeting the security requirements.
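A toy sketch of the lightweight primitives the protocol relies on, namely XOR combined with 128-bit pseudorandom values; the actual multi-tag grouping-proof message flow is not reproduced here, and this fragment is illustrative only.

```python
# XOR with a fresh 128-bit pseudorandom challenge: the kind of
# operation a low-cost passive C1G2 tag can afford. Toy fragment.
import secrets

MASK = (1 << 128) - 1

tag_key = secrets.randbits(128)      # shared between tag and verifier
challenge = secrets.randbits(128)    # fresh random challenge per session

# Tag side: cheap XOR "encryption" of its key under the challenge.
response = (tag_key ^ challenge) & MASK

# Verifier side: undo the XOR and check the key.
assert (response ^ challenge) & MASK == tag_key
print("toy tag response verified")
```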

Relevance:

60.00%

Publisher:

Abstract:

Network coding has shown the promise of significant throughput improvement. In this paper, we study network throughput using network coding and explore how the maximum throughput can be achieved in a two-way relay wireless network. Unlike previous studies, we consider a more general network with an arbitrary structure of overhearing status between receivers and transmitters. To efficiently utilize the coding opportunities, we introduce the concept of network coding cliques (NCCs), upon which a formal analysis of network throughput under network coding is developed. In particular, we derive a closed-form expression for the network throughput under a given traffic load in a slotted ALOHA network with basic medium access control. Furthermore, the maximum throughput, as well as the optimal medium access probability at each node, is studied under various network settings. Our theoretical findings have been validated by simulation as well.
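The coding gain that NCCs generalize can be seen in the classic two-way relay example: instead of forwarding two packets in two slots, the relay broadcasts their XOR in one slot, and each endpoint decodes using the packet it already holds. A minimal sketch:

```python
# Two-way relay coding gain: one broadcast of A XOR B replaces two
# separate forwards, since each endpoint already knows its own packet.
def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

pkt_from_alice = b"hello-bob!"
pkt_from_carol = b"hi, alice!"          # equal length for simplicity

coded = xor_bytes(pkt_from_alice, pkt_from_carol)   # relay broadcast

# Each endpoint XORs out its own packet to recover the other's.
assert xor_bytes(coded, pkt_from_alice) == pkt_from_carol
assert xor_bytes(coded, pkt_from_carol) == pkt_from_alice
print("both endpoints decoded from one relay transmission")
```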

Relevance:

60.00%

Publisher:

Abstract:

Many services and applications in vehicular ad-hoc networks (VANETs) require privacy-preserving and secure data communications. To improve driving safety and comfort, traffic-related status information is broadcast regularly and shared among drivers. Without security and privacy guarantees, attackers could track vehicles of interest by collecting and analyzing their traffic messages. Hence, anonymous message authentication is an essential requirement of VANETs. On the other hand, when a vehicle is involved in a dispute over a warning message, the certificate authority should be able to recover the real identity of this vehicle. To deal with this issue, we propose a new privacy-preserving authentication protocol with authority traceability using elliptic-curve-based chameleon hashing. Compared with existing schemes, our approach possesses the following features: 1) mutual and anonymous authentication for both vehicle-to-vehicle and vehicle-to-roadside communications, 2) vehicle unlinkability, 3) authority tracking capability, and 4) high computational efficiency. We also demonstrate the merits of our proposed scheme through security analysis and extensive performance evaluation.
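A minimal sketch of a chameleon hash, shown in a plain discrete-log group rather than the paper's elliptic-curve setting: CH(m, r) = g^m * h^r mod p with h = g^x, where the holder of the trapdoor x can compute a collision for any new message. Parameters are toy-sized assumptions.

```python
# Chameleon hash in a discrete-log group (toy parameters); the
# paper uses an elliptic-curve variant of the same trapdoor idea.
p = 2**61 - 1                 # Mersenne prime, toy group modulus
q = p - 1                     # exponent arithmetic modulo group order
g = 5
x = 1000003                   # trapdoor, chosen coprime to p - 1
h = pow(g, x, p)

def ch(m, r):
    return (pow(g, m, p) * pow(h, r, p)) % p

m1, r1 = 42, 777
target = ch(m1, r1)

# Trapdoor holder finds r2 with ch(m2, r2) == ch(m1, r1):
# need m1 + x*r1 == m2 + x*r2 (mod q).
m2 = 1000
r2 = (r1 + (m1 - m2) * pow(x, -1, q)) % q
assert ch(m2, r2) == target
print("collision found with the trapdoor")
```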

Relevance:

60.00%

Publisher:

Abstract:

Attribute-based signature (ABS) enables users to sign messages over attributes without revealing any information other than the fact that they have attested to the messages. However, existing ABS schemes impose a heavy computational cost during signing, which grows linearly with the size of the predicate formula. As a result, it is a significant challenge for resource-constrained devices (such as mobile devices or RFID tags) to perform such heavy computations independently. To tackle this challenge, we first propose and formalize a new paradigm called Outsourced ABS (OABS), in which the computational overhead at the user side is greatly reduced by outsourcing intensive computations to an untrusted signing-cloud service provider (S-CSP). Furthermore, we apply this novel paradigm to existing ABS schemes to reduce their complexity. As a result, we present two concrete OABS schemes: i) in the first OABS scheme, the number of exponentiations involved in signing is reduced from O(d) to O(1) (nearly three), where d is the upper bound of the threshold value defined in the predicate; ii) our second scheme is built on Herranz et al.'s construction with constant-size signatures, where the number of exponentiations in signing is reduced from O(d²) to O(d) and the communication overhead is O(1). Security analysis demonstrates that both OABS schemes are secure in terms of the unforgeability and attribute-signer privacy definitions specified in the proposed security model. Finally, to allow for high efficiency and flexibility, we discuss extensions of OABS and show how to achieve accountability as well.
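A minimal sketch of the server-aided exponentiation idea underlying such outsourcing, under assumptions: a plain prime-order group replaces the pairing groups of the real schemes, and a single precomputed pair (k, g^k) lets the user obtain g^a from an untrusted server via a blinded exponent and one local multiplication.

```python
# Server-aided exponentiation with exponent blinding (illustrative;
# the real OABS schemes work over pairing groups).
import secrets

p = 2**127 - 1      # Mersenne prime, toy group modulus
g = 3
q = p - 1

# Offline precomputation (or supplied by the authority).
k = secrets.randbelow(q)
g_k = pow(g, k, p)

def server_exp(exponent):
    # Untrusted S-CSP: sees only the blinded exponent a - k.
    return pow(g, exponent, p)

a = secrets.randbelow(q)                 # sensitive signing exponent
blinded = (a - k) % q
g_a = (server_exp(blinded) * g_k) % p    # one local multiplication

assert g_a == pow(g, a, p)
print("outsourced exponentiation correct")
```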

Relevance:

60.00%

Publisher:

Abstract:

Statistics-based Internet traffic classification using machine learning techniques has attracted extensive research interest lately, because of the increasing ineffectiveness of traditional port-based and payload-based approaches. In particular, unsupervised learning, that is, traffic clustering, is very important in real-life applications, where labeled training data are difficult to obtain and new patterns keep emerging. Although previous studies have applied some classic clustering algorithms such as K-Means and EM for the task, the quality of resultant traffic clusters was far from satisfactory. In order to improve the accuracy of traffic clustering, we propose a constrained clustering scheme that makes decisions with consideration of some background information in addition to the observed traffic statistics. Specifically, we make use of equivalence set constraints indicating that particular sets of flows are using the same application layer protocols, which can be efficiently inferred from packet headers according to the background knowledge of TCP/IP networking. We model the observed data and constraints using Gaussian mixture density and adapt an approximate algorithm for the maximum likelihood estimation of model parameters. Moreover, we study the effects of unsupervised feature discretization on traffic clustering by using a fundamental binning method. A number of real-world Internet traffic traces have been used in our evaluation, and the results show that the proposed approach not only improves the quality of traffic clusters in terms of overall accuracy and per-class metrics, but also speeds up the convergence.
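A minimal sketch of exploiting equivalence set constraints, with a simplification: flows in the same set are merged into one averaged point before clustering and the resulting label is propagated back, and plain k-means on made-up 2-D flow statistics stands in for the paper's constrained Gaussian-mixture EM.

```python
# Must-link constraints via merging: flows known to share an
# application-layer protocol are clustered as one point.
import numpy as np

rng = np.random.default_rng(2)

flows = np.array([[1.0, 1.1], [0.9, 1.0], [5.0, 5.2],
                  [5.1, 4.9], [1.1, 0.9], [4.8, 5.1]])
# Equivalence sets, e.g. flows sharing server IP/port/protocol.
eq_sets = [[0, 1, 4], [2, 3, 5]]

merged = np.array([flows[s].mean(axis=0) for s in eq_sets])

def kmeans(X, k, iters=20):
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

set_labels = kmeans(merged, k=2)
# Propagate each set's cluster label back to its member flows.
flow_labels = np.empty(len(flows), dtype=int)
for s, lab in zip(eq_sets, set_labels):
    flow_labels[s] = lab
print("per-flow cluster labels:", flow_labels)
```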

Relevance:

60.00%

Publisher:

Abstract:

Restraining the spread of rumors in online social networks (OSNs) has long been an important but difficult problem. Currently, there are mainly two types of methods: 1) blocking rumors at the most influential users or community bridges, or 2) spreading truths to clarify the rumors. Each method claims better performance than the others under its own assumptions and environments, but one of them must stand out from the rest, and identifying it is the focus of this paper. The difficulty is that no universal standard exists to evaluate them. To address this problem, we carry out a series of empirical and theoretical analyses on the basis of an introduced mathematical model. On this mathematical platform, each method is evaluated using real OSN data. We carry out three types of analysis in this work. First, we compare all the measures of locating important users. The results suggest that the degree and betweenness measures outperform all the others in the Facebook network. Second, we analyze the truth clarification method and find that it performs well over the long term, whereas the degree measure performs well only in the early stage. Third, in order to leverage these two methods, we further explore strategies in which different methods work together, and their equivalence. Given a fixed budget in the real world, our analysis provides a potential solution for finding a better strategy by integrating both types of methods. From both academic and technical perspectives, the work in this paper is an important step towards practical and optimal strategies for restraining rumors in OSNs.
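A minimal sketch of the first strategy, under assumptions: a toy independent-cascade spread on a random graph, with the top-degree nodes immunized; the paper's propagation model, the betweenness variant and the truth-clarification strategy are not reproduced, and all parameters are made up.

```python
# Degree-based blocking versus no blocking in a toy rumor cascade.
import random

random.seed(3)
N, P_EDGE, P_SPREAD = 200, 0.03, 0.3

# Toy Erdos-Renyi contact graph.
adj = {i: set() for i in range(N)}
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < P_EDGE:
            adj[i].add(j)
            adj[j].add(i)

def cascade(blocked, seed=0):
    # Independent-cascade rumor spread from one seed node.
    infected, frontier = {seed}, [seed]
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in infected and v not in blocked \
                        and random.random() < P_SPREAD:
                    infected.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(infected)

top_degree = [u for u in sorted(adj, key=lambda u: len(adj[u]),
                                reverse=True) if u != 0][:10]
print("infected, no blocking:   ", cascade(set()))
print("infected, top-10 blocked:", cascade(set(top_degree)))
```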

Relevance:

60.00%

Publisher:

Abstract:

Multicast is an important mechanism in modern wireless networks and has attracted significant efforts to improve its performance with respect to different metrics, including throughput, delay and energy efficiency. Traditionally, an ideal loss-free channel model is widely used to facilitate routing protocol design. However, the quality of wireless links is degraded, or even jeopardized, by many factors such as collisions, fading or environmental noise, resulting in transmission failures. In this paper, we propose a reliable multicast protocol, called CodePipe, that achieves energy efficiency, high throughput and fairness in lossy wireless networks. Building upon opportunistic routing and random linear network coding, CodePipe not only eliminates coordination between nodes but also significantly improves multicast throughput by exploiting both intra-batch and inter-batch coding opportunities. In particular, four key techniques, namely, an LP-based opportunistic routing structure, opportunistic feeding, fast batch moving and inter-batch coding, are proposed to offer significant improvements in throughput, energy efficiency and fairness. Moreover, we design an efficient online extension of CodePipe such that it can work in a dynamic network where nodes join and leave as time progresses. We evaluate CodePipe in the ns2 simulator by comparing it with two other state-of-the-art multicast protocols, MORE and Pacifier. Simulation results show that CodePipe significantly outperforms both of them.
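A minimal sketch of the random linear network coding primitive CodePipe builds on, restricted to GF(2) for brevity (practical systems typically code over GF(2^8)): a batch of packets is sent as random XOR combinations, and any K independent combinations suffice to decode by Gaussian elimination. Routing, batch control and feedback are omitted.

```python
# Random linear network coding over GF(2): encode a batch as random
# XOR combinations; decode by Gaussian elimination on bitmask rows.
import random

random.seed(4)
K = 4
batch = [random.randbytes(6) for _ in range(K)]   # one batch of packets

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode():
    # A coded packet: random nonzero GF(2) coefficient vector
    # (a K-bit mask) plus the XOR of the selected source packets.
    coef = random.randrange(1, 1 << K)
    payload = bytes(6)
    for i in range(K):
        if coef >> i & 1:
            payload = xor(payload, batch[i])
    return coef, payload

# Receiver: collect combinations until K independent rows are held.
rows = {}                               # pivot bit -> (coef, payload)
while len(rows) < K:
    coef, payload = encode()
    while coef:
        p = (coef & -coef).bit_length() - 1   # lowest set bit
        if p not in rows:
            rows[p] = (coef, payload)
            break
        pc, pp = rows[p]
        coef ^= pc
        payload = xor(payload, pp)

# Back-substitution, highest pivot first.
decoded = [None] * K
for p in sorted(rows, reverse=True):
    coef, payload = rows[p]
    for i in range(p + 1, K):
        if coef >> i & 1:
            payload = xor(payload, decoded[i])
    decoded[p] = payload

assert decoded == batch
print("decoded", K, "packets from random XOR combinations")
```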

Relevance:

60.00%

Publisher:

Abstract:

Cloud is becoming a dominant computing platform. Naturally, a question that arises is whether we can beat notorious DDoS attacks in a cloud environment. Researchers have demonstrated that the essential issue of DDoS attack and defense is resource competition between defenders and attackers. A cloud usually possesses abundant resources and has full control and dynamic allocation capability over them. Therefore, the cloud offers the potential to overcome DDoS attacks. However, individual cloud-hosted servers are still vulnerable to DDoS attacks if they run in the traditional way. In this paper, we propose a dynamic resource allocation strategy to counter DDoS attacks against individual cloud customers. When a DDoS attack occurs, we employ the idle resources of the cloud to clone sufficient intrusion prevention servers for the victim, in order to quickly filter out attack packets while guaranteeing the quality of service for benign users. We establish a mathematical model, based on queueing theory, to estimate the resource investment required. Through careful system analysis and real-world data set experiments, we conclude that we can defeat DDoS attacks in a cloud environment. © 2013 IEEE.
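A minimal sketch of the queueing-theoretic sizing step, under assumptions: the filtering tier is modeled as an M/M/c queue and intrusion prevention servers are cloned until the Erlang C probability of queueing falls below a target; the rates and target below are made up, and the paper's actual model may differ.

```python
# Size the cloned filtering tier with the standard Erlang C formula
# for an M/M/c queue. Illustrative rates, not the paper's numbers.
from math import factorial

def erlang_c(lam, mu, c):
    """P(wait > 0) for an M/M/c queue; requires lam < c * mu."""
    a = lam / mu                       # offered load in Erlangs
    rho = a / c
    s = sum(a**k / factorial(k) for k in range(c))
    top = a**c / factorial(c) / (1 - rho)
    return top / (s + top)

lam = 9000.0        # packet arrival rate, attack plus benign
mu = 1000.0         # rate one filtering server can scrub
c = 10
while lam >= c * mu or erlang_c(lam, mu, c) > 0.01:
    c += 1          # clone another intrusion prevention server
print("servers needed:", c, "P(wait):", round(erlang_c(lam, mu, c), 4))
```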

Relevance:

60.00%

Publisher:

Abstract:

Radio Frequency Identification (RFID) is a technology that has been deployed successfully for asset tracking within hospitals, aimed at improving the quality of processes. In the Australian hospital context, however, adoption of this technology seems sporadic. This research reports on a long-term investigation to gain a deeper understanding of the socio-technical factors involved in the adoption of RFID in Australian hospitals. The research was conducted using an interpretive multiple-case methodology, and the results were analyzed through an Actor-Network Theory (ANT) lens. © 2013 Infonomics Society.

Relevance:

60.00%

Publisher:

Abstract:

This brief proposes an efficient technique for constructing optimized prediction intervals (PIs) using the bootstrap technique. The method employs an innovative PI-based cost function in the training of the neural networks (NNs) used to estimate the target variance in the bootstrap method. An optimization algorithm is developed for minimizing the cost function and adjusting the NN parameters. The performance of the optimized bootstrap method is examined on seven synthetic and real-world case studies. It is shown that application of the proposed method improves the quality of constructed PIs by more than 28% over the existing technique, leading to narrower PIs with a coverage probability greater than the nominal confidence level.
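A minimal sketch of the baseline bootstrap-PI procedure the brief optimizes, under assumptions: an ensemble of polynomial regressors stands in for the NNs, the variance estimate is a naive model-spread-plus-noise form, and no PI-based cost function is optimized here.

```python
# Bootstrap prediction intervals: fit an ensemble on resamples and
# take mean +/- z * spread as the PI, then check empirical coverage.
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(x.size)

B, preds = 100, []
for _ in range(B):
    idx = rng.integers(0, x.size, x.size)      # bootstrap resample
    coeffs = np.polyfit(x[idx], y[idx], deg=5)
    preds.append(np.polyval(coeffs, x))
preds = np.array(preds)

mean = preds.mean(axis=0)
# Model spread plus a crude estimate of the noise variance.
sigma = np.sqrt(preds.var(axis=0) + np.mean((y - mean) ** 2))
lower, upper = mean - 1.96 * sigma, mean + 1.96 * sigma

coverage = np.mean((y >= lower) & (y <= upper))
print(f"empirical coverage: {coverage:.2%}")   # roughly 95% expected
```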