196 results for Biometric authentication
Abstract:
In this paper we tackle the problem of finding an efficient signature verification scheme when the number of signatures is significantly large and the verifier is relatively weak. In particular, we tackle the problem of message authentication in many-to-one communication networks known as concast communication. The paper presents three signature screening algorithms for a variant of ElGamal-type digital signatures. The cost of these schemes is n applications of hash functions, 2n modular multiplications, and n modular additions, plus the verification of one digital signature, where n is the number of signatures. The paper also presents a solution to the open problem of finding a fast screening scheme for non-RSA digital signatures.
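To make the screening idea concrete, here is a minimal Python sketch of batch screening for Schnorr-style signatures (a member of the ElGamal family) issued by a single signer; the combined check costs roughly one hash per signature plus a single exponentiation-based verification, in line with the cost profile quoted above. The toy group and the plain aggregation are illustrative assumptions, not the paper's schemes.

```python
# A minimal sketch of batch screening for Schnorr-style (ElGamal-family)
# signatures from a single signer. The 11-bit toy group is for
# illustration only, and this naive aggregation omits the protections a
# real screener needs against adversarially crafted signature pairs.
import hashlib
import secrets

q, p, g = 1019, 2039, 4          # toy Schnorr group: p = 2q + 1, g of order q

def H(R, m):
    return int.from_bytes(hashlib.sha256(f"{R}|{m}".encode()).digest(), "big") % q

def sign(x, m):
    k = secrets.randbelow(q - 1) + 1
    R = pow(g, k, p)
    return R, (k + x * H(R, m)) % q

def screen(y, batch):
    # One combined check instead of n full verifications:
    #   g^(sum s_i) == (prod R_i) * y^(sum e_i)   (mod p)
    S = sum(s for _, (R, s) in batch) % q        # n modular additions
    E = sum(H(R, m) for m, (R, _) in batch) % q  # n hash applications
    Rprod = 1
    for _, (R, _) in batch:                      # n modular multiplications
        Rprod = Rprod * R % p
    return pow(g, S, p) == Rprod * pow(y, E, p) % p

x = secrets.randbelow(q - 1) + 1                 # signer's key pair
y = pow(g, x, p)
batch = [(m, sign(x, m)) for m in ("pkt-0", "pkt-1", "pkt-2")]
print(screen(y, batch))                          # True
```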
Abstract:
A parallel authentication and public-key encryption scheme is introduced and exemplified on joint encryption and signing, which compares favorably with sequential Encrypt-then-Sign (EtS) or Sign-then-Encrypt (StE) schemes in terms of both efficiency and security. A security model for signcryption, and thus for joint encryption and signing, has recently been defined which considers possible attacks and security goals. Such a scheme is considered secure if the encryption part guarantees indistinguishability and the signature part prevents existential forgeries, for outsider as well as insider adversaries. We propose two parallel signcryption schemes, which are efficient alternatives to Commit-then-Sign-and-Encrypt (CtS&E). Both are provably secure in the random oracle model. The first, called generic parallel encrypt and sign, is secure if the encryption scheme is semantically secure against chosen-ciphertext attacks and the signature scheme prevents existential forgeries against random-message attacks. The second, called optimal parallel encrypt and sign, applies random oracles similar to the OAEP technique in order to achieve security using encryption and signature components with very weak security requirements: encryption is expected to be one-way under chosen-plaintext attacks, while the signature needs to be secure only against universal forgeries under random-plaintext attacks, which is actually the case for both plain-RSA encryption and signatures under the usual RSA assumption. Both proposals are generic in the sense that any suitable encryption and signature schemes (i.e. schemes that simply achieve the required security) can be used. Furthermore, they allow parallel encryption and signing as well as parallel decryption and verification. Properties of parallel encrypt and sign schemes are considered, and a new security standard for parallel signcryption is proposed.
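As a rough illustration of the parallel data flow only (not the paper's two constructions), the sketch below commits to the message and then runs the encrypt and sign branches independently; symmetric hashlib/hmac primitives stand in for the real public-key components, and all key names are hypothetical.

```python
# A rough sketch of the parallel flow only: commit to the message, then
# run the encrypt and sign branches independently. The hashlib/hmac
# primitives stand in for real public-key encryption and signatures, and
# all key names are hypothetical.
import hashlib, hmac, secrets

def commit(m: bytes):
    d = secrets.token_bytes(16)                  # decommitment string
    return hashlib.sha256(d + m).digest(), d     # commitment, decommitment

def keystream(n: int) -> bytes:
    return (hashlib.sha256(b"enc-key").digest() * (n // 32 + 1))[:n]

m = b"order #4711: transfer 100"
c, d = commit(m)

# The two branches are independent, so a sender may run them in parallel:
ciphertext = bytes(a ^ b for a, b in zip(d + m, keystream(16 + len(m))))  # toy Enc(d || m)
signature = hmac.new(b"sig-key", c, hashlib.sha256).digest()              # toy Sig(c)

# The receiver likewise decrypts and verifies in parallel, then opens the
# commitment to recover and check the message.
recovered = bytes(a ^ b for a, b in zip(ciphertext, keystream(len(ciphertext))))
d2, m2 = recovered[:16], recovered[16:]
sig_ok = hmac.compare_digest(signature, hmac.new(b"sig-key", c, hashlib.sha256).digest())
print(sig_ok and hashlib.sha256(d2 + m2).digest() == c, m2)
```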
Abstract:
We study the multicast stream authentication problem when an opponent can drop, reorder and inject data packets into the communication channel. In this context, bandwidth limitation and fast authentication are the core concerns, so any authentication scheme should reduce as much as possible both the packet overhead and the time spent at the receiver to check the authenticity of collected elements. Recently, Tartary and Wang developed a provably secure protocol with small packet overhead and a reduced number of signature verifications at the receiver. In this paper, we propose a hybrid scheme based on Tartary and Wang's approach and Merkle hash trees. Our construction exhibits a smaller overhead and much faster processing at the receiver, making it even more suitable for multicast than the earlier approach. Like Tartary and Wang's protocol, our construction is provably secure and allows total recovery of the data stream despite erasures and injections occurring during transmission.
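A minimal sketch of the Merkle hash tree ingredient: signing only the root lets each packet be authenticated with about log2(n) hash evaluations, with a single signature check amortized over the whole block. The helper names and path layout are illustrative, not Tartary and Wang's packet format.

```python
# A minimal Merkle-tree sketch (hypothetical helper names): each packet
# carries its authentication path, so the receiver recomputes the signed
# root with log2(n) hashes instead of one signature check per packet.
import hashlib

h = lambda b: hashlib.sha256(b).digest()

def merkle_root_and_paths(leaves):
    level = [h(x) for x in leaves]
    paths = [[] for _ in leaves]
    idx = list(range(len(leaves)))
    while len(level) > 1:
        if len(level) % 2:                       # duplicate last node on odd levels
            level.append(level[-1])
        for i, node in enumerate(idx):
            paths[i].append(level[node ^ 1])     # record the sibling hash
            idx[i] = node // 2
        level = [h(level[j] + level[j + 1]) for j in range(0, len(level), 2)]
    return level[0], paths

def verify(packet, path, root, leaf_index):
    node = h(packet)
    for sibling in path:
        node = h(node + sibling) if leaf_index % 2 == 0 else h(sibling + node)
        leaf_index //= 2
    return node == root

packets = [f"packet-{i}".encode() for i in range(8)]
root, paths = merkle_root_and_paths(packets)     # only `root` needs a signature
print(verify(packets[3], paths[3], root, 3))     # True, using hashes alone
```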
Abstract:
Secure protocols for password-based user authentication are well-studied in the cryptographic literature but have failed to see widespread adoption on the Internet; most proposals to date require extensive modifications to the Transport Layer Security (TLS) protocol, making deployment challenging. Recently, a few modular designs have been proposed in which a cryptographically secure password-based mutual authentication protocol is run inside a confidential (but not necessarily authenticated) channel such as TLS; the password protocol is bound to the established channel to prevent active attacks. Such protocols are useful in practice for a variety of reasons: security no longer relies on users' ability to validate server certificates, and the designs can potentially be implemented with no modifications to the secure channel protocol library. We provide a systematic study of such authentication protocols. Building on recent advances in modelling TLS, we give a formal definition of the intended security goal, which we call password-authenticated and confidential channel establishment (PACCE). We show generically that combining a secure channel protocol, such as TLS, with a password authentication protocol, where the two protocols are bound together using either the transcript of the secure channel's handshake or the server's certificate, results in a secure PACCE protocol. Our prototype based on TLS is available as a cross-platform client-side Firefox browser extension and a server-side web application, which can easily be installed on deployed web browsers and servers.
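The binding step can be pictured with a small sketch: the inner password protocol's confirmation message is computed over the outer channel's handshake transcript, so a tag produced inside one channel is useless in any other. Here an HMAC stands in for the real PAKE key-confirmation message, and all names and keys are hypothetical.

```python
# A minimal sketch of channel binding: the confirmation tag covers the
# TLS handshake transcript, so it only verifies inside that particular
# channel. HMAC stands in for the real PAKE key-confirmation message;
# all transcripts and keys here are hypothetical placeholders.
import hashlib, hmac

pake_key = hashlib.sha256(b"key derived by the inner password protocol").digest()
transcript = hashlib.sha256(b"...this channel's TLS handshake messages...").digest()

binder = hmac.new(pake_key, b"client" + transcript, hashlib.sha256).digest()

# The server recomputes the binder over its own view of the handshake;
# a man-in-the-middle necessarily produces a different transcript.
same_view = hashlib.sha256(b"...this channel's TLS handshake messages...").digest()
mitm_view = hashlib.sha256(b"...the attacker's separate handshake...").digest()
print(hmac.compare_digest(binder, hmac.new(pake_key, b"client" + same_view, hashlib.sha256).digest()))  # True
print(hmac.compare_digest(binder, hmac.new(pake_key, b"client" + mitm_view, hashlib.sha256).digest()))  # False
```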
Abstract:
In this paper, the security of two recent RFID mutual authentication protocols is investigated. The first protocol was proposed by Huang et al. [7] and the second by Huang, Lin and Li [6]. We show that these two protocols have several weaknesses. In Huang et al.'s scheme, an adversary can determine the 32-bit secret password with a probability of 2^(-2), and in the Huang-Lin-Li scheme, a passive adversary can recognize a target tag with a success probability of 1 - 2^(-4), while an active adversary can determine all 32 bits of the Access password with a success probability of 2^(-4). The computational complexity of these attacks is negligible.
Abstract:
Integration of biometrics is considered an attractive solution for the issues associated with password-based human authentication, as well as for secure storage and release of cryptographic keys, which is one of the critical issues in modern cryptography. However, the widespread adoption of bio-cryptographic solutions is somewhat restricted by the fuzziness associated with biometric measurements. Therefore, error control mechanisms must be adopted to ensure that the fuzziness of biometric inputs can be sufficiently countered. In this paper, we outline the existing techniques used in bio-cryptography for this purpose and explain how they are deployed in different types of solutions. Finally, we elaborate on the important factors to consider when choosing an appropriate error correction mechanism for a particular biometric-based solution.
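A classic example of such an error control mechanism is the fuzzy commitment scheme, sketched below with a 3x repetition code as the error-correcting layer; real deployments use stronger codes (e.g. BCH or Reed-Solomon) matched to the biometric's error rate, and all bit vectors here are illustrative.

```python
# A minimal fuzzy-commitment sketch (Juels-Wattenberg style) using a 3x
# repetition code as the error-correcting layer; real systems use stronger
# codes (BCH, Reed-Solomon) matched to the biometric's error rate.
import hashlib

def rep_encode(bits):            # 1 key bit -> 3 code bits
    return [b for b in bits for _ in range(3)]

def rep_decode(bits):            # majority vote per 3-bit group
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

def enroll(key_bits, biometric_bits):
    code = rep_encode(key_bits)
    helper = [c ^ b for c, b in zip(code, biometric_bits)]      # public helper data
    return helper, hashlib.sha256(bytes(key_bits)).hexdigest()  # stored verifier

def authenticate(helper, verifier, fresh_biometric_bits):
    noisy_code = [h ^ b for h, b in zip(helper, fresh_biometric_bits)]
    key = rep_decode(noisy_code)        # error correction absorbs the fuzziness
    return hashlib.sha256(bytes(key)).hexdigest() == verifier

key = [1, 0, 1, 1]
enrolled = [0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
helper, verifier = enroll(key, enrolled)
fresh = enrolled[:]
fresh[4] ^= 1; fresh[9] ^= 1            # two bit errors in the fresh reading
print(authenticate(helper, verifier, fresh))   # True: errors corrected
```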
Abstract:
Supervisory Control and Data Acquisition (SCADA) systems are one of the key foundations of smart grids. The Distributed Network Protocol version 3 (DNP3) is a standard SCADA protocol designed to facilitate communications in substations and smart grid nodes. The protocol is embedded with a security mechanism called Secure Authentication (DNP3-SA), which ensures that end-to-end communication security is provided in substations. This paper presents a formal model for the behavioural analysis of DNP3-SA using Coloured Petri Nets (CPN). Our DNP3-SA CPN model is capable of testing and verifying various attack scenarios (modification, replay, spoofing, and combined complex attacks) along with mitigation strategies. Using the model has revealed a previously unidentified flaw in the DNP3-SA protocol that can be exploited by an attacker with access to the network interconnecting DNP3 devices. Such an attacker can launch a successful attack on an outstation without possessing the pre-shared keys, by replaying a previously authenticated command with arbitrary parameters. We propose an update to the DNP3-SA protocol that removes the flaw and prevents such attacks. The update is validated and verified using our CPN model, demonstrating the effectiveness of the model and the importance of formal protocol analysis.
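The shape of the reported flaw can be pictured with a deliberately simplified toy (not DNP3-SA's actual message format, nor the CPN model): if the authenticator fails to cover the command's parameters, a captured exchange is accepted again with attacker-chosen parameters.

```python
# A deliberately simplified toy, not DNP3-SA's actual message format: if
# the authenticator covers the challenge and function code but not the
# command's parameters, a captured exchange can be replayed with
# arbitrary parameters, which is the shape of the flaw described above.
import hmac, hashlib

key = b"pre-shared-key"

def authenticator(challenge: bytes, function_code: bytes) -> bytes:
    return hmac.new(key, challenge + function_code, hashlib.sha256).digest()

# Legitimate exchange, captured by the attacker on the wire.
challenge, fc, params = b"ch-01", b"OPERATE", b"breaker=CLOSED"
tag = authenticator(challenge, fc)

# An outstation check that ignores params: the replayed command with
# attacker-chosen parameters still verifies.
def outstation_accepts(challenge, fc, params, tag):
    return hmac.compare_digest(tag, authenticator(challenge, fc))

print(outstation_accepts(challenge, fc, b"breaker=OPEN", tag))  # True: flaw
```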
Abstract:
BACKGROUND Tilted disc syndrome (TDS) is associated with characteristic ocular findings. The purpose of this study was to evaluate the ocular, refractive, and biometric characteristics of patients with TDS. METHODS This case-control study included 41 eyes of 25 patients with established TDS and 40 eyes of 20 healthy control subjects. All participants underwent a complete ocular examination, including refraction and analysis using Fourier transformation, slit lamp biomicroscopy, pachymetry, keratometry, and ocular biometry. Corneal topography examinations were performed in the syndrome group only. RESULTS There were no significant differences in spherical equivalent (P = 0.13) or total astigmatism (P = 0.37) between groups. However, mean best spectacle-corrected visual acuity (logMAR) was significantly worse in TDS patients (P = 0.003). Lenticular astigmatism was greater in the syndrome group, whereas the corneal component was greater in controls (P = 0.059 and P = 0.028, respectively). The measured biometric features did not differ between groups, except for lens thickness and the lens-axial length factor, which were greater in the TDS group (P = 0.007 and P = 0.055, respectively). CONCLUSIONS Clinically significant lenticular astigmatism, more oblique corneal astigmatism, and thicker lenses were characteristic findings in patients with TDS.
Abstract:
Purpose: To evaluate the ocular refractive and biometric characteristics of patients with tilted disc syndrome (TDS). Methods: This case-control study comprised 41 eyes of 25 patients with established TDS and 40 eyes of 20 age- and sex-matched healthy control subjects. All participants had a complete ocular examination including refraction and analysis using Fourier transformation, slit lamp biomicroscopy, pachymetry, keratometry, and ocular biometry. Corneal topography examinations were performed in the syndrome group only. Results: There were no significant differences in spherical equivalent (p = 0.334) or total astigmatism (p = 0.246) between groups. However, mean best spectacle-corrected visual acuity was significantly worse in TDS patients (p < 0.001). Lenticular astigmatism was significantly greater in the syndrome group, while the corneal component was greater in the controls (p = 0.004 and p = 0.002, respectively). The measured biometric features did not differ between groups, except for lens thickness, relative lens position, and the lens-axial length factor, which were greater in the TDS group (p = 0.002, p = 0.015, and p = 0.025, respectively). Conclusions: Clinically significant lenticular astigmatism, more oblique corneal astigmatism, and thicker lenses were characteristic findings in patients with TDS.
Abstract:
It is not uncommon to hear a person of interest described by their height, build, and clothing (i.e. type and colour). These semantic descriptions are commonly used by people to describe others, as they are quick to communicate and easy to understand. However, such queries are not easily utilised within intelligent video surveillance systems, as they are difficult to transform into a representation that can be processed by computer vision algorithms. In this paper we propose a novel approach that transforms such a semantic query into an avatar in the form of a channel representation that is searchable within a video stream. We show how spatial, colour and prior information (person shape) can be incorporated into the channel representation to locate a target using a particle-filter-like approach. We demonstrate state-of-the-art performance for locating a subject in video based on a description, achieving a relative performance improvement of 46.7% over the baseline. We also apply this approach to person re-detection, and show that it can be used to re-detect a person in a video stream without the use of a person detector.
Abstract:
PURPOSE To estimate the refractive indices used by the Lenstar biometer to translate measured optical path lengths into geometrical path lengths within the eye. METHODS Axial lengths of model eyes were determined using the IOLMaster and Lenstar biometers; comparing those lengths gave an estimate of the overall eye refractive index used by the Lenstar. Using the Lenstar Graphical User Interface, we noticed that boundaries between media could be manipulated, introducing opposite changes in the optical path lengths on either side of the boundary. The ratios of those opposite changes were combined with the overall eye refractive index to estimate separate refractive indices for each medium. Furthermore, Haag-Streit provided us with a template for obtaining 'air thicknesses' to compare with geometrical distances. RESULTS The axial length estimates obtained using the IOLMaster and the Lenstar agreed to within 0.01 mm. Estimates of the group refractive indices used in the Lenstar were 1.340, 1.341, 1.415, and 1.354 for the cornea, aqueous, lens, and overall eye, respectively. These refractive indices did not match those of schematic eyes, although they were close in the cases of the aqueous and lens. Linear equations relating air thicknesses to geometrical thicknesses were consistent with our findings. CONCLUSION The Lenstar uses different refractive indices for different ocular media. Some of the indices, such as that for the cornea, are not physiological; it is therefore likely that they correspond to instrument-specific calibration corrections rather than to real optical path lengths.
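The relations the study exploits are simple enough to state numerically: the biometer measures an optical path length OPL = n * geometrical thickness, and moving a boundary by delta shifts the optical paths on its two sides by +n1*delta and -n2*delta, so the ratio of those opposite shifts reveals n1/n2. A toy computation with hypothetical values:

```python
# A toy numeric sketch (all values hypothetical) of the two relations the
# study uses: OPL = n * geometrical thickness, and a boundary moved by
# delta shifts the OPLs on its two sides by +n1*delta and -n2*delta, so
# the ratio of those opposite shifts reveals n1/n2.
n_aqueous, n_lens = 1.341, 1.415        # group indices as estimated above
delta = 0.05                            # boundary displacement, mm
shift_aqueous, shift_lens = n_aqueous * delta, -n_lens * delta
print(-shift_aqueous / shift_lens)      # ~0.948 = n_aqueous / n_lens

opl_lens = 6.2                          # hypothetical measured OPL, mm
print(opl_lens / n_lens)                # geometrical lens thickness ~4.38 mm
```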
Abstract:
RFID is an important technology that can be used to build a ubiquitous society, but an RFID system uses an open radio frequency signal to transfer information, which poses many serious threats to privacy and security. In general, the computing and storage resources of an RFID tag are very limited, making these security and privacy problems difficult to solve, especially for low-cost RFID tags. To ensure the security and privacy of low-cost RFID systems, we propose a lightweight authentication protocol based on a hash function. The protocol ensures forward security and prevents information leakage, location tracing, eavesdropping, replay attacks and spoofing. It completes strong authentication of the reader to the tag by authenticating twice, and it transfers only part of the encrypted tag identifier in each session, so it is difficult for an adversary to intercept the whole identifier of a tag. The protocol is simple and requires little computation and storage, making it well suited to low-cost RFID systems.
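The general flavour of such hash-based protocols can be shown in a few lines; the sketch below uses a single challenge-response round and a full digest, whereas the protocol described above authenticates twice and reveals only part of the encrypted identifier per session, so treat this purely as the underlying pattern. All names are hypothetical.

```python
# A minimal sketch of the general hash-based challenge-response pattern;
# the paper's protocol differs (it authenticates twice and reveals only
# part of the encrypted identifier per session), so treat this purely as
# the flavour of such lightweight protocols. All names are hypothetical.
import hashlib, secrets

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

class Tag:
    def __init__(self, ident: bytes, key: bytes):
        self.ident, self.key = ident, key
    def respond(self, challenge: bytes) -> bytes:
        return h(self.key, challenge, self.ident)   # identifier never sent in clear

class Reader:
    def __init__(self, database: dict):
        self.database = database                    # back-end store: ident -> key
    def authenticate(self, tag: Tag) -> bool:
        challenge = secrets.token_bytes(16)
        response = tag.respond(challenge)
        # The reader searches its database for a matching (ident, key) pair.
        # Forward security in such schemes comes from ratcheting the key
        # (key = h(key)) on both sides after each successful session.
        return any(h(k, challenge, i) == response for i, k in self.database.items())

db = {b"tag-42": secrets.token_bytes(16)}
print(Reader(db).authenticate(Tag(b"tag-42", db[b"tag-42"])))   # True
```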
Abstract:
The richness of the iris texture and its variability across individuals make it a useful biometric trait for personal authentication. One of the key stages in classical iris recognition is the normalization process, where the annular iris region is mapped to a dimensionless pseudo-polar coordinate system. This process results in a rectangular structure that can be used to compensate for differences in scale and variations in pupil size. Most iris recognition methods in the literature adopt linear sampling in the radial and angular directions when performing iris normalization. In this paper, a biomechanical model of the iris is used to define a novel nonlinear normalization scheme that improves iris recognition accuracy under different degrees of pupil dilation. The proposed biomechanical model is used to predict the radial displacement of any point in the iris at a given dilation level, and this information is incorporated in the normalization process. Experimental results on the WVU pupil light reflex database (WVU-PLR) indicate the efficacy of the proposed technique, especially when matching iris images with large differences in pupil size.
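For context, the classical normalization step being modified is the rubber-sheet mapping sketched below; the linear radial sampling in r_frac is exactly what a biomechanical model would replace with predicted nonlinear displacements. Function and parameter names are illustrative, and the input is a stand-in image.

```python
# A minimal rubber-sheet normalization sketch: the annulus between pupil
# and limbus is resampled onto a rectangle. The linear radial sampling in
# r_frac is what the nonlinear scheme replaces with radial displacements
# predicted by a biomechanical model at each dilation level.
import numpy as np

def normalize_iris(image, cx, cy, r_pupil, r_limbus, out_h=32, out_w=256):
    thetas = np.linspace(0, 2 * np.pi, out_w, endpoint=False)
    r_frac = np.linspace(0, 1, out_h)                 # linear sampling
    rect = np.zeros((out_h, out_w), dtype=image.dtype)
    for i, f in enumerate(r_frac):
        r = r_pupil + f * (r_limbus - r_pupil)        # pupil -> limbus
        xs = (cx + r * np.cos(thetas)).round().astype(int)
        ys = (cy + r * np.sin(thetas)).round().astype(int)
        rect[i] = image[ys.clip(0, image.shape[0] - 1), xs.clip(0, image.shape[1] - 1)]
    return rect

toy = np.random.randint(0, 256, (240, 320), dtype=np.uint8)   # stand-in iris image
print(normalize_iris(toy, 160, 120, r_pupil=30, r_limbus=90).shape)   # (32, 256)
```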
Abstract:
Digital signatures are often used by trusted authorities to make unique bindings between a subject and a digital object; for example, certificate authorities certify that a public key belongs to a domain name, and time-stamping authorities certify that a certain piece of information existed at a certain time. Traditional digital signature schemes, however, impose no uniqueness conditions, so a trusted authority could make multiple certifications for the same subject but different objects, be it intentionally, by accident, or under (legal or illegal) coercion. We propose the notion of a double-authentication-preventing signature, in which a value to be signed is split into two parts: a subject and a message. If a signer ever signs two different messages for the same subject, enough information is revealed to allow anyone to compute valid signatures on behalf of the signer. This double-signature forgeability property discourages signers from misbehaving (a form of self-enforcement) and would give binding authorities such as CAs cryptographic grounds to resist legal coercion. We give a generic construction using a new type of trapdoor function with extractability properties, which we show can be instantiated using the group of sign-agnostic quadratic residues modulo a Blum integer; we also show an additional application of these new extractable trapdoor functions to standard digital signatures.
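The self-enforcement property can be illustrated with a toy that is not the paper's extractable-trapdoor construction: if a Schnorr-style signer derives its nonce from the subject alone, then two signatures on different messages for the same subject reuse the nonce, and anyone holding the pair can extract the signing key.

```python
# A toy illustration of the self-enforcement property, not the paper's
# construction: the Schnorr-style nonce below depends only on the subject,
# so double-signing one subject reuses it and leaks the key (ignoring the
# negligible chance of colliding hashes in this tiny toy group).
import hashlib, secrets

q, p, g = 1019, 2039, 4                      # toy Schnorr group: p = 2q + 1

def H(*parts: str) -> int:
    return int.from_bytes(hashlib.sha256("|".join(parts).encode()).digest(), "big") % q

def sign(x, subject, message):
    k = H("nonce", subject, str(x))          # nonce derived from the subject only
    R = pow(g, k, p)
    return R, (k + x * H(str(R), subject, message)) % q

x = secrets.randbelow(q - 1) + 1
R1, s1 = sign(x, "example.com", "key-A")
R2, s2 = sign(x, "example.com", "key-B")     # second message, same subject
e1 = H(str(R1), "example.com", "key-A")
e2 = H(str(R2), "example.com", "key-B")
extracted = (s1 - s2) * pow(e1 - e2, -1, q) % q   # anyone can compute this
print(extracted == x)                        # True: the signing key leaks
```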