724 results for Security, usability, digital signature
Abstract:
This book constitutes the refereed proceedings of the 11th International Conference on Cryptology and Network Security, CANS 2012, held in Darmstadt, Germany, in December 2012. The 22 revised full papers presented were carefully reviewed and selected from 99 submissions. The papers are organized in topical sections on cryptanalysis; network security; cryptographic protocols; encryption; and S-box theory.
Abstract:
The Australasian Information Security Conference (AISC) 2011 was held on 18-19 January 2011 in Perth, Australia, as part of the Australasian Computer Science Week 2011. AISC grew out of the Australasian Information Security Workshop and officially changed its name to the Australasian Information Security Conference in 2008. The main aim of the AISC is to provide a venue for Australasian and other researchers to present their work on all aspects of information security and to promote collaboration between academic and industrial researchers working in this area.
Abstract:
The Australasian Information Security Conference (AISC) 2012 was held at RMIT University in Melbourne, Australia, as part of the Australasian Computer Science Week, January 30 - February 3, 2012. AISC grew out of the Australasian Information Security Workshop and officially changed its name to the Australasian Information Security Conference in 2008. The main aim of the AISC is to provide a venue for researchers to present their work on all aspects of information security and to promote collaboration between academic and industrial researchers working in this area.
Abstract:
To ensure the safe operation of Web-based systems in Web environments, we propose an SSPA (Server-based SHA-1 Page-digest Algorithm) to verify the integrity of Web content before the server issues an HTTP response to a user request. In addition to standard security measures, our Java implementation of the SSPA, called the Dynamic Security Surveillance Agent (DSSA), provides further security in terms of content integrity for Web-based systems. Its function is to prevent the display on client machines of Web content that has been altered through the malicious acts of attackers and intruders. This protects the reputation of organisations from cyber-attacks and ensures the safe operation of Web systems by dynamically monitoring the integrity of a Web site's content on demand. We discuss our findings in terms of the applicability and practicality of the proposed system. We also discuss its time metrics, specifically its computational overhead at the Web server and the overall latency from the clients' point of view, using different Internet access methods. The SSPA, our DSSA implementation, some experimental results and related work are all discussed.
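A minimal sketch of the page-digest idea described above, not the authors' DSSA implementation: the server keeps a known-good SHA-1 digest for each page and recomputes the digest before answering a request, refusing to serve content whose digest no longer matches. The paths and page contents are illustrative placeholders.

```python
import hashlib

def sha1_page_digest(content: bytes) -> str:
    """SHA-1 digest of a page body, as a hex string."""
    return hashlib.sha1(content).hexdigest()

# Digests recorded from trusted copies of the pages (illustrative content).
KNOWN_DIGESTS = {
    "/index.html": sha1_page_digest(b"<html><body>Welcome</body></html>"),
}

def verify_before_response(path: str, current_content: bytes) -> bool:
    """Return True only if the page still matches its recorded digest.
    The server would serve the page on True and return an error page otherwise."""
    expected = KNOWN_DIGESTS.get(path)
    return expected is not None and sha1_page_digest(current_content) == expected
```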
Abstract:
One-time proxy signatures are one-time signatures for which a primary signer can delegate his or her signing capability to a proxy signer. In this work we propose two one-time proxy signature schemes with different security properties. Unlike other existing one-time proxy signatures that are constructed from public-key cryptography, our proposed schemes are based on one-way functions without trapdoors, and so they inherit the communication and computation efficiency of traditional one-time signatures. Although, from a verifier's point of view, signatures generated by the proxy are indistinguishable from those created by the primary signer, a trusted authority can be equipped with an algorithm that allows the authority to settle disputes between the signers. In our constructions, we use a combination of one-time signatures, oblivious transfer protocols and certain combinatorial objects. We characterise these new combinatorial objects and present constructions for them.
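The building block referred to above, a one-time signature from a one-way function without a trapdoor, can be illustrated with a Lamport-style scheme. This is a simplified sketch of that primitive only, not the proxy construction, which additionally combines oblivious transfer protocols and combinatorial objects.

```python
import hashlib
import os

N_BITS = 256  # we sign the SHA-256 digest of the message

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    """Secret key: two random strings per digest bit; public key: their hashes."""
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(N_BITS)]
    pk = [(_h(a), _h(b)) for a, b in sk]
    return sk, pk

def _bits(msg: bytes):
    d = int.from_bytes(_h(msg), "big")
    return [(d >> i) & 1 for i in range(N_BITS)]

def sign(sk, msg: bytes):
    """Reveal one preimage per digest bit; a key pair must sign one message only."""
    return [pair[b] for pair, b in zip(sk, _bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(_h(s) == pair[b] for s, pair, b in zip(sig, pk, _bits(msg)))
```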
Abstract:
A parallel authentication and public-key encryption scheme is introduced and exemplified on joint encryption and signing; it compares favorably with sequential Encrypt-then-Sign (EtS) or Sign-then-Encrypt (StE) schemes in terms of both efficiency and security. A security model for signcryption, and thus for joint encryption and signing, has recently been defined which considers possible attacks and security goals. Such a scheme is considered secure if the encryption part guarantees indistinguishability and the signature part prevents existential forgeries, for outsider as well as insider adversaries. We propose two schemes of parallel signcryption, which are an efficient alternative to Commit-then-Encrypt-and-Sign (CtE&S). They are both provably secure in the random oracle model. The first one, called generic parallel encrypt and sign, is secure if the encryption scheme is semantically secure against chosen-ciphertext attacks and the signature scheme prevents existential forgeries against random-message attacks. The second scheme, called optimal parallel encrypt and sign, applies random oracles similar to the OAEP technique in order to achieve security using encryption and signature components with very weak security requirements: encryption is expected to be one-way under chosen-plaintext attacks, while the signature needs to be secure against universal forgeries under random-plaintext attacks, which is actually the case for both plain-RSA encryption and signature under the usual RSA assumption. Both proposals are generic in the sense that any suitable encryption and signature schemes (i.e. ones that simply achieve the required security) can be used. Furthermore, they allow both parallel encryption and signing, as well as parallel decryption and verification. Properties of parallel encrypt-and-sign schemes are considered and a new security standard for parallel signcryption is proposed.
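A structural sketch of the parallel idea, assuming caller-supplied generic encrypt/sign/decrypt/verify primitives rather than the paper's concrete instantiations: the sender commits to the message, then encrypts the opening and signs the commitment concurrently; the receiver likewise decrypts and verifies concurrently.

```python
import hashlib
import os
from concurrent.futures import ThreadPoolExecutor

def commit(message: bytes, rand: bytes) -> bytes:
    """Hash commitment to the message (hiding/binding of SHA-256 assumed)."""
    return hashlib.sha256(rand + message).digest()

def parallel_encrypt_and_sign(message, encrypt, sign):
    """encrypt/sign are placeholder callables; the two calls run concurrently."""
    rand = os.urandom(32)
    c = commit(message, rand)
    with ThreadPoolExecutor(max_workers=2) as pool:
        ct_future = pool.submit(encrypt, rand + message)  # hides the opening
        sig_future = pool.submit(sign, c)                  # authenticates the commitment
        return c, ct_future.result(), sig_future.result()

def parallel_decrypt_and_verify(c, ct, sig, decrypt, verify):
    """Returns the message if both the signature and the commitment check out."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        open_future = pool.submit(decrypt, ct)
        ok_future = pool.submit(verify, c, sig)
        opening = open_future.result()
        rand, message = opening[:32], opening[32:]
        if ok_future.result() and commit(message, rand) == c:
            return message
        return None
```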
Abstract:
Standard signature schemes are usually designed only to achieve weak unforgeability, i.e. preventing forgery of signatures on new messages not previously signed. However, most signature schemes are randomised and allow many possible signatures for a single message. In this case, it may be possible to produce a new signature on a previously signed message. Some applications require that this type of forgery also be prevented; this requirement is called strong unforgeability. At PKC 2006, Boneh, Shen and Waters presented an efficient transform based on any randomised trapdoor hash function which converts a weakly unforgeable signature into a strongly unforgeable signature, and applied it to construct a strongly unforgeable signature based on the CDH problem. However, the transform of Boneh et al. only applies to a class of so-called partitioned signatures. Although many schemes fall into this class, some do not, for example the DSA signature. Hence it is natural to ask whether one can obtain a truly generic efficient transform based on any randomised trapdoor hash function which converts any weakly unforgeable signature into a strongly unforgeable one. We answer this question in the positive by presenting a simple modification of the Boneh-Shen-Waters transform. Our modified transform uses two randomised trapdoor hash functions.
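The transforms above rely on a randomised trapdoor (chameleon) hash: roughly, the weak scheme signs a trapdoor hash of the message and some signature randomness, so producing a second signature on the same message would require finding a hash collision. Below is a minimal sketch of that primitive in its standard discrete-log instantiation with toy parameters; it is not the paper's two-hash transform itself.

```python
# Randomised trapdoor (chameleon) hash, discrete-log instantiation.
# Toy parameters only (p = 2q + 1 with p, q prime); real use needs large groups.
p, q, g = 467, 233, 4   # g generates the order-q subgroup of Z_p*
x = 57                  # trapdoor
h = pow(g, x, p)        # public hash key

def ch(m: int, r: int) -> int:
    """CH(m, r) = g^m * h^r mod p; collision-resistant without the trapdoor."""
    return (pow(g, m % q, p) * pow(h, r % q, p)) % p

def collide(m: int, r: int, m_new: int) -> int:
    """With the trapdoor x, find r' such that CH(m_new, r') == CH(m, r)."""
    return (r + (m - m_new) * pow(x, -1, q)) % q

assert ch(10, 99) == ch(25, collide(10, 99, 25))
```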
Abstract:
We study the multicast stream authentication problem when an opponent can drop, reorder and inject data packets into the communication channel. In this context, bandwidth limitation and fast authentication are the core concerns. Therefore any authentication scheme should reduce as much as possible both the packet overhead and the time spent at the receiver to check the authenticity of collected elements. Recently, Tartary and Wang developed a provably secure protocol with small packet overhead and a reduced number of signature verifications to be performed at the receiver. In this paper, we propose a hybrid scheme based on Tartary and Wang's approach and Merkle hash trees. Our construction exhibits a smaller overhead and much faster processing at the receiver, making it even more suitable for multicast than the earlier approach. Like Tartary and Wang's protocol, our construction is provably secure and allows total recovery of the data stream despite erasures and injections occurring during transmission.
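The Merkle-tree ingredient of the hybrid scheme can be sketched as follows: the sender hashes a block of packets into a binary tree, signs only the root, and attaches to each packet the sibling hashes needed to recompute the root. This is a generic sketch of the data structure (power-of-two block size assumed), not the full protocol.

```python
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(packets):
    """All levels of the Merkle tree, leaves first (packet count must be a power of 2)."""
    level = [H(p) for p in packets]
    levels = [level]
    while len(level) > 1:
        level = [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels  # levels[-1][0] is the root, which is signed once per block

def auth_path(levels, index):
    """Sibling hashes needed to recompute the root from leaf `index`."""
    path = []
    for level in levels[:-1]:
        path.append((index & 1, level[index ^ 1]))  # (am-I-the-right-child, sibling)
        index //= 2
    return path

def verify(packet, path, root) -> bool:
    node = H(packet)
    for is_right, sibling in path:
        node = H(sibling + node) if is_right else H(node + sibling)
    return node == root
```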
Abstract:
This paper describes research investigating expertise and the types of knowledge used by airport security screeners. It applies a multi-method approach incorporating eye tracking, concurrent verbal protocol and interviews. Results show that novice and expert security screeners primarily access perceptual knowledge and experience little difficulty during routine situations. During non-routine situations, however, experience was found to be a determining factor for effective interactions and problem solving. Experts were found to use strategic knowledge and demonstrated structured use of interface functions integrated into efficient problem-solving sequences. Comparatively, novices experienced more knowledge limitations and uncertainty, resulting in interaction breakdowns characterised by trial-and-error interaction sequences. This research suggests that the quality of the knowledge security screeners have access to has implications for visual and physical interface interactions and their integration into problem-solving sequences. Implications and recommendations for the design of interfaces used in the airport security screening context are discussed. These recommendations aim to improve the integration of interactions into problem-solving sequences, encourage the development of problem-schema knowledge, and support the skills and knowledge of the personnel who interact with security screening systems.
Abstract:
Pseudorandom generators (PRGs) based on the RSA inversion (one-wayness) problem have been extensively studied in the literature over the last 25 years. These generators have the attractive feature of provable pseudorandomness security assuming the hardness of the RSA inversion problem. However, despite extensive study, the most efficient provably secure RSA-based generators output asymptotically at most O(log n) bits per multiply modulo an RSA modulus of bit length n, and hence are too slow to be used in many practical applications. To bring theory closer to practice, we present a simple modification to the proof of security by Fischlin and Schnorr of an RSA-based PRG, which shows that one can obtain an RSA-based PRG that outputs Ω(n) bits per multiply and has provable pseudorandomness security assuming the hardness of a well-studied variant of the RSA inversion problem, where a constant fraction of the plaintext bits are given. Our result gives a positive answer to an open question posed by Gennaro (J. of Cryptology, 2005) regarding finding a PRG that beats the rate of O(log n) bits per multiply at the cost of a reasonable assumption on RSA inversion.
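The general shape of an RSA-based PRG of the kind discussed above can be sketched as follows: iterate the RSA map x -> x^e mod N on a secret seed and output some low-order bits of the state at each step. The loop below is purely illustrative with toy parameters; how many bits may safely be output per iteration, and under which assumption, is exactly what the security analysis determines.

```python
import secrets

def rsa_prg(n_modulus: int, e: int, seed: int, bits_per_iter: int, n_out_bits: int) -> int:
    """Toy RSA-iteration PRG: state <- state^e mod N, output low-order bits each step."""
    state = seed % n_modulus
    out, produced = 0, 0
    mask = (1 << bits_per_iter) - 1
    while produced < n_out_bits:
        state = pow(state, e, n_modulus)            # one RSA iteration
        out = (out << bits_per_iter) | (state & mask)
        produced += bits_per_iter
    return out

# Usage with toy numbers (a real deployment needs a properly generated RSA modulus
# and a seed drawn uniformly from the appropriate domain):
N = 3233  # 61 * 53
stream = rsa_prg(N, e=17, seed=secrets.randbelow(N - 2) + 2, bits_per_iter=4, n_out_bits=64)
```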
Abstract:
Firm-customer digital connectedness for effective sensing and responding is a strategic imperative for contemporary competitive firms. This research-in-progress paper conceptualizes and operationalizes the firm-customer mobile digital connectedness of a smart-mobile customer. The empirical investigation focuses on mobile app users and the impact of mobile apps on customer expectations. Based on pilot data collected from 127 customers, we tested hypotheses pertaining to firm-customer mobile digital connectedness and customer expectations. Our test analysis using linear and non-linear postulations reveals that customers raise their expectations as they increase their digital interactions with a firm.
Abstract:
The nature of services and service delivery has been changing rapidly since the 1980s, when many seminal papers in services research were published. Services are increasingly digital, or have a digital component. Further, a large and heterogeneous literature, with competing and overlapping definitions, many of which are dated and inappropriate to contemporary digital service offerings, is impeding progress in digital services research. In this conceptual paper, we offer a critical review of some existing conceptualizations of services and digital services. We argue that an inductive approach to understanding cognition about digital services is required to develop a taxonomy of digital services and a new vocabulary. We argue that this is a prerequisite to theorizing about digital services, including understanding quality drivers, value propositions, and quality determinants for different digital service types. We propose a research approach for reconceptualising digital services and service quality, and outline methodological approaches and outcomes.
Abstract:
This chapter addresses children’s development of digital media literacies with iPads in preschool settings. The authors argue that children living in post-industrial societies participate in ‘transmedia’ experiences that call for new understandings of media literacy that recognise children’s ability to successfully participate in complex media ecologies. The chapter outlines a model for digital media literacies that includes the application of digital materials and media concepts through the processes of media production and media analysis. This model is then used as a framework to interpret children’s media production work across the preschools in our project.
Abstract:
Much has been written about transferring class materials and teaching techniques to digital platforms, but less has been written about applying heuristic organizing constructs in the same manner. With the transformation of learning ecologies over the past decades, as well as requirements to adjust to constantly shifting digital tools and environments, the challenges for learning facilitators are to readily adapt and change, and to engage a changing learner demographic. Most important, however, is to engage effectively with learners in these online environments. This article reviews the existing literature on the heuristic construct of academagogy [1] and applies a case study methodology to discuss the first application of academagogy to the online delivery of an undergraduate design unit. Through a focus on effective teaching and learning techniques, the transfer from face-to-face (f2f) to the digital realm is explored through four main focal points: tools for teaching, teaching and learning, communicating with students, and effective teaching methods. These four focal points are then used to discuss ways to meet the challenges of teaching online, including how they create new dimensions in teaching practice and how the digital experience changes learning experiences. The article concludes by reflecting on and consolidating the similarities and differences between the face-to-face and digital deliveries, and by suggesting changes to the academagogic heuristic to make it easier to use in a digital space.
Abstract:
Vehicular Ad-hoc Networks (VANETs) can make roads safer, cleaner, and smarter. They can offer a wide range of services, both safety-related and non-safety-related. Many safety-related VANET applications are real-time and mission-critical and require strict guarantees of security and reliability. Even non-safety-related multimedia applications, which will play an important role in the future, will require security support. The lack of such security and privacy in VANETs is one of the key hindrances to their widespread deployment. An insecure and unreliable VANET can be more dangerous than a system without VANET support. So it is essential to make sure that “life-critical safety” information is secure enough to rely on. Securing VANETs while appropriately protecting the privacy of drivers and vehicle owners is a very challenging task. In this work we summarize the attacks, corresponding security requirements and challenges in VANETs. We also present the most popular generic security policies, which are based on prevention as well as detection methods. Many VANET applications require system-wide security support rather than security at individual layers of the VANET protocol stack. In this work we review the existing literature from the perspective of a holistic approach to security. Finally, we provide some possible future directions to achieve system-wide as well as privacy-friendly security in VANETs.