196 results for Biometric authentication
Abstract:
Medical industries have brought Information Technology (IT) into their systems for both patients and medical staff, owing to the numerous benefits of IT we experience at present. Moreover, the Mobile healthcare (M-health) system has been developed as the first step towards a Ubiquitous Health Environment (UHE). With its mobility and multiple functions, an M-health system will be able to provide more efficient and varied services for both doctors and patients. However, because mobile signals are invisible, hackers have easier access to hospital networks than to wired network systems. This may result in security incidents unless security protocols are well implemented. In this paper, user authentication and authorization procedures will be applied as a featured component at each level of the M-health system in the hospital environment. Accordingly, the M-health system in the hospital will meet the optimal requirements as a countermeasure to its vulnerabilities.
Abstract:
In a commercial environment, it is advantageous to know how long it takes customers to move between different regions, how long they spend in each region, and where they are likely to go as they move from one location to another. Presently, these measures can only be determined manually, or through the use of hardware tags (e.g. RFID). Soft biometrics are characteristics that can be used to describe, but not uniquely identify, an individual. They include traits such as height, weight, gender, hair, skin and clothing colour. Unlike traditional biometrics, soft biometrics can be acquired by surveillance cameras at range without any user cooperation. While these traits cannot provide robust authentication, they can be used to provide identification at long range, and aid in object tracking and detection in disjoint camera networks. In this chapter we propose using colour, height and luggage soft biometrics to determine operational statistics relating to how people move through a space. A novel average soft biometric is used to locate people who look distinct, and these people are then detected at various locations within a disjoint camera network to gradually obtain operational statistics.
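To make the pipeline concrete, here is a minimal sketch of soft-biometric re-identification for operational statistics, assuming a simple colour-histogram-plus-height descriptor, a z-score distinctiveness test and nearest-neighbour matching; these choices are illustrative assumptions, not the chapter's actual algorithm.

```python
# Hypothetical sketch of soft-biometric re-identification for operational
# statistics; the descriptor (colour histogram + height) and the thresholds
# are illustrative assumptions, not the chapter's actual method.
import numpy as np

def descriptor(colour_hist, height_cm):
    """Concatenate a normalised colour histogram with a scaled height value."""
    h = np.asarray(colour_hist, dtype=float)
    h = h / (h.sum() + 1e-9)
    return np.append(h, height_cm / 200.0)  # crude scaling so height ~ O(1)

def is_distinct(desc, population, k=2.0):
    """Flag a person whose descriptor is far (in z-score terms) from the average."""
    pop = np.stack(population)
    mean, std = pop.mean(axis=0), pop.std(axis=0) + 1e-9
    return np.linalg.norm((desc - mean) / std) > k * np.sqrt(desc.size)

def match(desc, gallery, max_dist=0.5):
    """Nearest-neighbour match of a probe descriptor against another camera."""
    if not gallery:
        return None
    dists = [np.linalg.norm(desc - g) for g in gallery]
    i = int(np.argmin(dists))
    return i if dists[i] < max_dist else None

# Toy usage: a "distinct" person seen at camera A is re-detected at camera B;
# the timestamps of the two detections would give a transit-time sample.
cam_a = [descriptor(np.random.rand(16), 150 + 40 * np.random.rand()) for _ in range(50)]
probe = descriptor(np.concatenate([np.zeros(15), [1.0]]), 195)  # unusual colour, very tall
cam_b = cam_a[:10] + [probe + 0.01 * np.random.randn(probe.size)]
if is_distinct(probe, cam_a):
    print("matched index at camera B:", match(probe, cam_b))
```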
Abstract:
We blend research from human-computer interface (HCI) design with computationally based cryptographic provable security. We explore the notion of practice-oriented provable security (POPS), moving the focus to a higher level of abstraction (POPS+) for use in providing provable security for security ceremonies involving humans. In doing so we highlight some challenges and paradigm shifts required to achieve meaningful provable security for a protocol which includes a human. We move the focus of security ceremonies from being protocols in their context of use, to the protocols being cryptographic building blocks in a higher-level protocol (the security ceremony), to which POPS can be applied. In order to illustrate the need for our approach, we analyse both a protocol proven secure in theory, and a similar protocol implemented by a financial institution, from both HCI and cryptographic perspectives.
Abstract:
Security of RFID authentication protocols has received considerable interest recently. However, an important aspect of such protocols that has not received as much attention is the efficiency of their communication. In this paper we investigate the efficiency benefits of pre-computation for time-constrained applications in small to medium RFID networks. We also outline a protocol utilizing this mechanism in order to demonstrate the benefits and drawbacks of using this approach. The proposed protocol shows promising results, as it is able to offer the security of untraceable protocols whilst requiring only time comparable to that of more efficient but traceable protocols.
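The abstract does not spell out the protocol, but the trade-off it describes can be sketched with a common hash-lookup idea: the reader pre-computes the tag responses it expects and identifies a tag with a single table lookup instead of an O(N) online hash search. The message format and window strategy below are assumptions for illustration only.

```python
# Hypothetical sketch of server-side pre-computation for an untraceable,
# hash-based RFID lookup; the exact protocol in the paper is not given in the
# abstract, so the message format here is an illustrative assumption.
import hashlib
import os

def h(key: bytes, counter: int) -> bytes:
    return hashlib.sha256(key + counter.to_bytes(4, "big")).digest()

class Server:
    def __init__(self, tag_keys, window=8):
        self.tag_keys = dict(tag_keys)          # tag_id -> secret key
        self.window = window                    # counter values precomputed ahead
        self.table = {}                         # expected response -> (tag_id, counter)
        for tag_id, key in self.tag_keys.items():
            for c in range(window):
                self.table[h(key, c)] = (tag_id, c)

    def identify(self, response: bytes):
        """O(1) dictionary lookup instead of an O(N * window) online search."""
        hit = self.table.pop(response, None)    # remove the used value (freshness)
        if hit is None:
            return None
        tag_id, c = hit
        # refill the table so the tag stays identifiable at a later counter
        self.table[h(self.tag_keys[tag_id], c + self.window)] = (tag_id, c + self.window)
        return tag_id

# Toy usage: each tag emits h(key, counter), which never repeats, so a passive
# observer cannot link sessions, yet the reader identifies the tag instantly.
keys = {i: os.urandom(16) for i in range(1000)}
server = Server(keys.items())
print(server.identify(h(keys[42], 0)))   # -> 42
print(server.identify(h(keys[42], 1)))   # -> 42
```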
Abstract:
There are different ways to authenticate humans, which is an essential prerequisite for access control. The authentication process can be subdivided into three categories that rely on something someone i) knows (e.g. a password), and/or ii) has (e.g. a smart card), and/or iii) is (biometric features). Besides classical attacks on password solutions and the risk that identity-related objects can be stolen, traditional biometric solutions have their own disadvantages, such as the need for expensive devices and the risk of stolen bio-templates. Moreover, existing approaches usually perform the authentication process only once, at the beginning of a session. Non-intrusive and continuous monitoring of user activities emerges as a promising way to harden the authentication process, adding a behavioral variant of the third category: how someone behaves. In recent years, various keystroke-dynamics approaches have been published that are able to authenticate humans based on their typing behavior. The majority focus on so-called static text approaches, where users are requested to type a previously defined text. Relatively few techniques are based on free text approaches that allow transparent monitoring of user activities and provide continuous verification. Unfortunately, only a few solutions are deployable in application environments under realistic conditions; unsolved problems include, for instance, scalability, high response times and high error rates. The aim of this work is the development of behavior-based verification solutions. Our main requirement is to deploy these solutions under realistic conditions within existing environments, in order to enable transparent, free-text-based continuous verification of active users with low error rates and low response times.
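As a rough illustration of free-text, behavior-based verification, the sketch below builds a per-user profile of digraph (consecutive key pair) latencies and continuously scores new typing against it; the specific features, distance measure and thresholds are common choices in the keystroke-dynamics literature and are assumptions here, not necessarily this work's method.

```python
# Hypothetical sketch of free-text keystroke verification using digraph
# (consecutive key pair) latencies; features and thresholds are illustrative.
from collections import defaultdict
from statistics import mean, stdev

def digraph_latencies(events):
    """events: list of (key, press_time_ms) in typing order."""
    feats = defaultdict(list)
    for (k1, t1), (k2, t2) in zip(events, events[1:]):
        feats[(k1, k2)].append(t2 - t1)
    return feats

def build_profile(training_events):
    """Per-digraph mean/std latency from enrolment typing samples."""
    feats = digraph_latencies(training_events)
    return {dg: (mean(v), stdev(v) if len(v) > 1 else 25.0)
            for dg, v in feats.items()}

def verify(profile, session_events, z_threshold=2.5, min_shared=5):
    """Continuous check: average |z-score| over digraphs shared with the profile."""
    feats = digraph_latencies(session_events)
    scores = []
    for dg, latencies in feats.items():
        if dg in profile:
            mu, sigma = profile[dg]
            scores += [abs(lat - mu) / max(sigma, 1.0) for lat in latencies]
    if len(scores) < min_shared:
        return None                       # not enough evidence yet
    return sum(scores) / len(scores) < z_threshold

# Toy usage: enrol on one typing stream, then verify a later free-text session.
enrol = [("t", 0), ("h", 95), ("e", 180), ("t", 400), ("h", 492), ("e", 575),
         ("t", 800), ("h", 898), ("e", 984)]
profile = build_profile(enrol)
session = [("t", 0), ("h", 97), ("e", 181), ("t", 400), ("h", 496), ("e", 580)]
print(verify(profile, session, min_shared=3))   # True for a consistent typist
```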
Abstract:
Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, and non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security properties required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility. The method is also designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
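The generic pipeline described above (feature extraction, key-dependent randomization, quantization, binary encoding) can be sketched as follows; this is an illustrative baseline with a linear random projection and a trivial median quantizer, not the dissertation's HOS/Radon construction.

```python
# Hypothetical sketch of a generic robust image hash pipeline (feature
# extraction -> key-dependent randomization -> quantization -> binary code).
# This mirrors the baseline structure described in the abstract, not the
# dissertation's HOS/Radon method.
import numpy as np

def extract_features(image, n_feats=64):
    """Crude global feature: coarse block means of a grayscale image."""
    img = np.asarray(image, dtype=float)
    side = int(np.sqrt(n_feats))
    h, w = img.shape[0] // side, img.shape[1] // side
    return np.array([img[i*h:(i+1)*h, j*w:(j+1)*w].mean()
                     for i in range(side) for j in range(side)])

def robust_hash(image, key, n_bits=64):
    feats = extract_features(image)
    rng = np.random.default_rng(key)            # key-dependent randomization
    proj = rng.standard_normal((n_bits, feats.size))
    v = proj @ feats                            # linear random projection
    threshold = np.median(v)                    # a (trivially) "learnt" quantizer
    return (v > threshold).astype(np.uint8)     # 1-bit quantization / encoding

def hamming(h1, h2):
    return int(np.count_nonzero(h1 != h2))

# Toy usage: a slightly noisy copy of the image should hash close to the
# original (robustness), while an unrelated image should not.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64))
noisy = np.clip(img + rng.normal(0, 4, img.shape), 0, 255)
other = rng.integers(0, 256, (64, 64))
print(hamming(robust_hash(img, 1), robust_hash(noisy, 1)))   # small distance
print(hamming(robust_hash(img, 1), robust_hash(other, 1)))   # roughly n_bits / 2
```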
Abstract:
Reliability of the performance of biometric identity verification systems remains a significant challenge. Individual biometric samples of the same person (identity class) are not identical at each presentation, and performance degradation arises from intra-class variability and inter-class similarity. These limitations lead to false accepts and false rejects that are interdependent, so it is difficult to reduce the rate of one type of error without increasing the other. The focus of this dissertation is to investigate a method, based on classifier fusion techniques, to better control the trade-off between the verification errors, using text-dependent speaker verification as the test platform. A sequential classifier fusion architecture that integrates multi-instance and multi-sample fusion schemes is proposed. This fusion method enables a controlled trade-off between false alarms and false rejects. For statistically independent classifier decisions, analytical expressions for each type of verification error are derived using the base classifier performances. As this assumption may not always be valid, these expressions are modified to incorporate the correlation between statistically dependent decisions from clients and impostors. The architecture is empirically evaluated for text-dependent speaker verification using Hidden Markov Model based, digit-dependent speaker models in each stage, with multiple attempts for each digit utterance. The trade-off between the verification errors is controlled using two parameters, the number of decision stages (instances) and the number of attempts at each decision stage (samples), fine-tuned on an evaluation/tuning set. The statistical validity of the derived expressions for the error estimates is evaluated on test data. The performance of the sequential method is further shown to depend on the order in which the digits (instances) are combined and on the nature of the repeated attempts (samples). The false rejection and false acceptance rates for the proposed fusion are estimated using the base classifier performances, the variance in correlation between classifier decisions, and the sequence of classifiers with favourable dependence selected using the 'Sequential Error Ratio' criterion. The error rates are better estimated by incorporating user-dependent information (such as speaker-dependent thresholds and speaker-specific digit combinations) and class-dependent information (such as client-impostor-dependent favourable combinations and class-error-based threshold estimation). The proposed architecture is desirable in most speaker verification applications, such as remote authentication and telephone and internet shopping applications. The tuning of the parameters - the number of instances and samples - serves both the security and user convenience requirements of speaker-specific verification. The architecture investigated here is applicable to verification using other biometric modalities such as handwriting, fingerprints and keystrokes.
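For statistically independent decisions, one natural decision rule (every digit stage must be passed, and a stage is passed if any of its attempts is accepted) gives product-form error expressions of the kind referred to above; the dissertation's exact rule and its correlation corrections may differ, so treat this as a sketch.

```latex
% N = number of decision stages (instances), M = attempts per stage (samples);
% FAR_{ij}, FRR_{ij} are the base classifier error rates at stage i, attempt j.
\mathrm{FAR} = \prod_{i=1}^{N}\Bigl[\,1-\prod_{j=1}^{M}\bigl(1-\mathrm{FAR}_{ij}\bigr)\Bigr],
\qquad
\mathrm{FRR} = 1-\prod_{i=1}^{N}\Bigl[\,1-\prod_{j=1}^{M}\mathrm{FRR}_{ij}\Bigr].
```

Under this rule, adding stages (larger N) drives the false acceptance rate down while pushing the false rejection rate up, and allowing more attempts per stage (larger M) does the reverse, which is the controlled trade-off the abstract describes.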
Abstract:
We propose a new kind of asymmetric mutual authentication from passwords with stronger privacy against malicious servers, lest they be tempted to engage in “cross-site user impersonation” against each other. It enables a person to authenticate with arbitrarily many independent servers, over adversarial channels, using a single short password that is memorable and reusable. Besides the usual PAKE security guarantees, our framework goes to great lengths to secure the password against brute-force cracking from privileged server information.
Abstract:
We propose a simple and effective way to achieve secure quantum direct secret sharing. The proposed scheme uses the properties of fountain codes to allow a realization of the physical conditions necessary for implementing the no-cloning principle for eavesdropping checks and authentication. In our scheme, to achieve a variety of security purposes, nonorthogonal state particles are inserted into the transmitted sequence carrying the secret shares in order to disorder it. However, the positions of the inserted nonorthogonal state particles are not announced directly, but are obtained from the degrees and positions of a sequence that is pre-shared between Alice and each Bob. Moreover, the parties can confirm whether an eavesdropper exists without exchanging classical messages. Most importantly, the proposed scheme is shown to be secure against an adversary who does not know the positions of the inserted nonorthogonal state particles and the sequence constituted by the first particles of every EPR pair.
Abstract:
At NDSS 2012, Yan et al. analyzed the security of several challenge-response type user authentication protocols against passive observers, and proposed a generic counting-based statistical attack to recover the secret of some counting-based protocols given a number of observed authentication sessions. Roughly speaking, the attack is based on the fact that secret (pass) objects appear in challenges with a different probability from non-secret (decoy) objects when the responses are taken into account. Although they mentioned that a protocol susceptible to this attack should minimize this difference, they gave few details, beyond a handful of suggestions, as to how this can be achieved. In this paper, we attempt to fill this gap by generalizing the attack with a much more comprehensive theoretical analysis. Our treatment is more quantitative, which enables us to describe a method to theoretically estimate a lower bound on the number of sessions for which a protocol can be safely used against the attack. Our results include 1) two proposed fixes that make counting protocols practically safe against the attack at the cost of usability, 2) the observation that the attack can also be used on non-counting-based protocols as long as challenge generation is contrived, and 3) two main design principles for user authentication protocols which can be considered extensions of the principles from Yan et al. This detailed theoretical treatment can be used as a guideline during the design of counting-based protocols to determine their susceptibility to this attack. The Foxtail protocol, one of the protocols analyzed by Yan et al., is used as a representative to illustrate our theoretical and experimental results.
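The core of the counting attack can be reproduced in a few lines against a toy counting protocol (here, the response is the parity of the number of pass objects shown); the protocol, parameters and scoring rule below are illustrative assumptions and are not Foxtail itself.

```python
# Hypothetical simulation of the generic counting-based statistical attack the
# paper analyses, run against a toy counting protocol. The toy protocol and
# parameters are illustrative assumptions, not Foxtail.
import numpy as np

rng = np.random.default_rng(7)
N_OBJECTS, N_PASS, P_SHOW, SESSIONS = 30, 4, 0.3, 20000
pass_objects = set(rng.choice(N_OBJECTS, N_PASS, replace=False).tolist())

# Simulate observed sessions: which objects were shown, and the user's response
# (parity of the number of pass objects present in the challenge).
shown = rng.random((SESSIONS, N_OBJECTS)) < P_SHOW
responses = shown[:, sorted(pass_objects)].sum(axis=1) % 2

# Attack: objects whose appearance frequency differs between response classes
# are likely pass objects (decoys appear with the same frequency either way).
freq_r1 = shown[responses == 1].mean(axis=0)
freq_r0 = shown[responses == 0].mean(axis=0)
scores = np.abs(freq_r1 - freq_r0)
guessed = set(np.argsort(scores)[-N_PASS:].tolist())

print("true pass objects:   ", sorted(pass_objects))
print("attack's top guesses:", sorted(guessed))
```

With enough observed sessions the pass objects separate clearly from the decoys, which is why a lower bound on the number of sessions a protocol can safely tolerate is the quantity worth estimating.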
Abstract:
This paper presents a formal security analysis of the current Australian e-passport implementation using the model-checking tools CASPER/CSP/FDR. We highlight security issues in the current implementation and identify new threats when an e-passport system is integrated with an automated processing system like SmartGate. The paper also provides a security analysis of the European Union (EU) proposal for Extended Access Control (EAC), which is intended to provide improved security in protecting biometric information of the e-passport bearer. The current e-passport specification fails to provide a list of adequate security goals that could be used for security evaluation. We fill this gap by presenting a collection of security goals for the evaluation of e-passport protocols. Our analysis confirms existing security weaknesses that were previously identified and shows that both the Australian e-passport implementation and the EU proposal fail to address many security and privacy aspects that are paramount in implementing a secure border control mechanism. ACM Classification: C.2.2 (Communication/Networking and Information Technology – Network Protocols – Model Checking), D.2.4 (Software Engineering – Software/Program Verification – Formal Methods), D.4.6 (Operating Systems – Security and Privacy Protection – Authentication)
Abstract:
We study the multicast stream authentication problem when an opponent can drop, reorder and introduce data packets into the communication channel. In such a model, packet overhead and computing efficiency are two parameters to be taken into account when designing a multicast stream authentication protocol. In this paper, we propose to use two families of erasure codes to deal with this problem, namely rateless codes and maximum distance separable codes. Our constructions will have the following advantages. First, our packet overhead will be small. Second, the number of signature verifications to be performed at the receiver is O(1). Third, every receiver will be able to recover all the original data packets emitted by the sender despite the losses and injections that occurred during transmission.
Abstract:
Recently, a new human authentication scheme called PAS (predicate-based authentication service) was proposed, which does not require the assistance of any supplementary device. The main security claim of PAS is to resist passive adversaries who can observe the whole authentication session between the human user and the remote server. In this paper we show that PAS is insecure against both a brute-force attack and a probabilistic attack. In particular, we show that its security against brute-force attacks was strongly overestimated. Furthermore, we introduce a probabilistic attack which can break part of the password even with a very small number of observed authentication sessions. Although the proposed attack cannot completely break the password, it can downgrade the PAS system to a much weaker system similar to common OTP (one-time password) systems.
Abstract:
The first-generation e-passport standard has been shown to be insecure and prone to various attacks. To strengthen it, the European Union (EU) has proposed an Extended Access Control (EAC) mechanism for e-passports that intends to provide better security in protecting the biometric information of the e-passport bearer. However, our analysis shows that the EU proposal fails to address many security and privacy issues that are paramount in implementing a strong security mechanism. In this paper we propose an on-line authentication mechanism for electronic passports that addresses the weaknesses in existing implementations of both the International Civil Aviation Organisation (ICAO) and the EU. Our proposal utilises the ICAO PKI implementation, thus requiring very few modifications to the existing infrastructure, which is already well established.
Abstract:
Digital signatures are often used by trusted authorities to make unique bindings between a subject and a digital object; for example, certificate authorities certify a public key belongs to a domain name, and time-stamping authorities certify that a certain piece of information existed at a certain time. Traditional digital signature schemes however impose no uniqueness conditions, so a trusted authority could make multiple certifications for the same subject but different objects, be it intentionally, by accident, or following a (legal or illegal) coercion. We propose the notion of a double-authentication-preventing signature, in which a value to be signed is split into two parts: a subject and a message. If a signer ever signs two different messages for the same subject, enough information is revealed to allow anyone to compute valid signatures on behalf of the signer. This double-signature forgeability property discourages signers from misbehaving---a form of self-enforcement---and would give binding authorities like CAs some cryptographic arguments to resist legal coercion. We give a generic construction using a new type of trapdoor functions with extractability properties, which we show can be instantiated using the group of sign-agnostic quadratic residues modulo a Blum integer.
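A familiar analogy helps picture the self-enforcement property (it is only an analogy; the paper's construction uses sign-agnostic quadratic residues modulo a Blum integer, not Schnorr): reusing a Schnorr nonce across two different messages lets anyone extract the signer's secret key from the two signatures, and a double-authentication-preventing signature deliberately engineers the same kind of extraction, with the subject playing the role the reused nonce plays below. The group parameters are toy values for illustration only.

```python
# Analogy only: classic Schnorr nonce-reuse key extraction, to illustrate the
# "sign twice for one subject and anyone can forge" idea that DAPS builds in
# deliberately. Toy parameters; NOT the paper's construction, not secure.
import hashlib

p, q, g = 10007, 5003, 4          # p = 2q + 1; g generates the order-q subgroup

def H(r: int, msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(str(r).encode() + msg).digest(), "big") % q

def sign(x: int, k: int, msg: bytes):
    """Schnorr signature with an explicitly supplied nonce k (to force reuse)."""
    r = pow(g, k, p)
    e = H(r, msg)
    s = (k + e * x) % q
    return e, s

def extract_key(sig1, sig2):
    """Given two signatures made with the same nonce on different messages,
    solve s1 - s2 = (e1 - e2) * x (mod q) for the secret key x."""
    (e1, s1), (e2, s2) = sig1, sig2
    return ((s1 - s2) * pow(e1 - e2, -1, q)) % q

x = 777                           # signer's secret key
k = 123                           # nonce reused across two different messages
sig1 = sign(x, k, b"subject: example.com, message A")
sig2 = sign(x, k, b"subject: example.com, message B")
print(extract_key(sig1, sig2) == x)   # True: anyone can now sign as the signer
```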