48 results for Biometric authentication
Abstract:
Sixty samples of milk, Halloumi cheese and local grazing plants (i.e. shrubs) were collected over a year from dairy farms located in three different regions of Cyprus. Major and trace elements were quantified using inductively coupled plasma-atomic emission spectroscopy (ICP-AES). Milk and Halloumi cheese produced in different geographical locations presented significant differences in the concentrations of some of the elements analysed. Principal component analysis showed grouping of samples according to the region of production for both milk and cheese samples. These findings show that the assay of elements can provide useful fingerprints for the characterisation of dairy products.
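A minimal sketch of the kind of principal component analysis referred to above, assuming a hypothetical matrix of element concentrations with one row per sample and a region label per row; the element set, sample counts, and values are placeholders, not the study's measurements.

```python
# Sketch: PCA-based grouping of dairy samples by element profile (illustrative data only).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
regions = np.repeat(["Region A", "Region B", "Region C"], 20)   # 60 samples, 3 regions
X = rng.normal(size=(60, 8))   # stand-in for element concentrations (e.g. Ca, Mg, K, Na, Zn, Fe, Cu, Mn)

# Standardise so each element contributes equally, then project onto 2 principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

# Samples from the same region would cluster together in the score plot.
for region in np.unique(regions):
    centroid = scores[regions == region].mean(axis=0)
    print(region, "centroid:", np.round(centroid, 2))
```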
Abstract:
Although visual surveillance has emerged as an effective technology for public security, privacy has become an issue of great concern in the transmission and distribution of surveillance videos. For example, personal facial images should not be browsed without permission. To cope with this issue, face image scrambling has emerged as a simple solution for privacy-related applications. Consequently, online facial biometric verification needs to be carried out in the scrambled domain, thus bringing a new challenge to face classification. In this paper, we investigate face verification issues in the scrambled domain and propose a novel scheme to handle this challenge. In our proposed method, to make feature extraction from scrambled face images robust, a biased random subspace sampling scheme is applied to construct fuzzy decision trees from randomly selected features, and a fuzzy forest decision using fuzzy memberships is then obtained by combining all fuzzy tree decisions. In our experiment, we first estimated the optimal parameters for the construction of the random forest, and then applied the optimized model to benchmark tests using three publicly available face datasets. The experimental results validated that our proposed scheme can robustly cope with the challenging tests in the scrambled domain and achieved improved accuracy over all tests, making our method a promising candidate for emerging privacy-related facial biometric applications.
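The ensemble idea can be illustrated with a short sketch: random feature subspaces are sampled, one decision tree is trained per subspace, and the per-tree class probabilities are averaged. Ordinary scikit-learn decision trees and uniform (rather than biased) subspace sampling stand in for the paper's fuzzy decision trees and biased sampling; all names and data below are illustrative.

```python
# Sketch of a random-subspace tree ensemble for scrambled-domain face verification.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_subspace_forest(X, y, n_trees=50, subspace_size=64, rng=None):
    rng = np.random.default_rng(rng)
    forest = []
    for _ in range(n_trees):
        # Biased sampling would weight informative dimensions; uniform sampling is used here.
        dims = rng.choice(X.shape[1], size=subspace_size, replace=False)
        tree = DecisionTreeClassifier(max_depth=8).fit(X[:, dims], y)
        forest.append((dims, tree))
    return forest

def forest_decision(forest, x):
    # Average per-tree class probabilities (a crisp stand-in for fuzzy membership fusion).
    probs = np.mean([t.predict_proba(x[dims].reshape(1, -1))[0] for dims, t in forest], axis=0)
    return probs.argmax(), probs

# Toy usage with random "scrambled" feature vectors.
X = np.random.default_rng(1).normal(size=(200, 256))
y = np.random.default_rng(2).integers(0, 2, size=200)
forest = train_subspace_forest(X, y, rng=3)
label, memberships = forest_decision(forest, X[0])
```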
Abstract:
This document describes the cryptographic hash function BLAKE2 and makes the algorithm specification and C source code conveniently available to the Internet community. BLAKE2 comes in two main flavors: BLAKE2b is optimized for 64-bit platforms and BLAKE2s for smaller architectures. BLAKE2 can be directly keyed, making it functionally equivalent to a Message Authentication Code (MAC).
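Because BLAKE2 can be keyed directly, a MAC needs no separate HMAC construction. A minimal sketch using Python's standard-library hashlib binding; the key and message below are examples.

```python
# Keyed BLAKE2b used as a MAC, via the standard library.
import hashlib
from hmac import compare_digest

key = b"0123456789abcdef0123456789abcdef"   # BLAKE2b accepts keys up to 64 bytes
msg = b"message to authenticate"

tag = hashlib.blake2b(msg, key=key, digest_size=32).hexdigest()

def verify(msg, key, tag):
    expected = hashlib.blake2b(msg, key=key, digest_size=32).hexdigest()
    return compare_digest(expected, tag)   # constant-time comparison

assert verify(msg, key, tag)
```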
Abstract:
A practically viable multi-biometric recognition system should not only be stable, robust and accurate but should also adhere to real-time processing speed and memory constraints. This study proposes a cascaded classifier-based framework for use in biometric recognition systems. The proposed framework utilises a set of weak classifiers to reduce the enrolled users' dataset to a small list of candidate users. This list is then used by a strong classifier set as the final stage of the cascade to formulate the decision. At each stage, the candidate list is generated by a Mahalanobis distance-based match score quality measure. One of the key features of the authors' framework is that each classifier in the ensemble can be designed to use a different modality, thus providing the advantages of a truly multimodal biometric recognition system. In addition, it is one of the first truly multimodal cascaded classifier-based approaches for biometric recognition. The performance of the proposed system is evaluated for both single and multiple modalities to demonstrate the effectiveness of the approach.
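A sketch of the shortlisting step described above, under the assumption that each enrolled user is represented by a template vector and that the Mahalanobis distance to the probe ranks candidates; the gallery, dimensions, and the `mahalanobis_shortlist` helper are illustrative, not the authors' implementation.

```python
# Rank enrolled users by Mahalanobis distance to the probe and keep the top-k candidates
# for the strong classifier stage of the cascade.
import numpy as np

def mahalanobis_shortlist(probe, gallery, k=10):
    """probe: (d,) feature vector; gallery: (n_users, d) enrolled templates."""
    cov_inv = np.linalg.pinv(np.cov(gallery, rowvar=False))
    diffs = gallery - probe
    d2 = np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs)   # squared Mahalanobis distances
    return np.argsort(d2)[:k]                               # indices of candidate users

gallery = np.random.default_rng(0).normal(size=(500, 16))
probe = gallery[42] + 0.05 * np.random.default_rng(1).normal(size=16)
candidates = mahalanobis_shortlist(probe, gallery, k=10)    # user 42 should appear in the list
```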
Abstract:
In this paper, a novel and effective lip-based biometric identification approach with the Discrete Hidden Markov Model Kernel (DHMMK) is developed. Lips are described by shape features (both geometrical and sequential) on two different grid layouts: rectangular and polar. These features are then specifically modeled by a DHMMK and learnt by a support vector machine classifier. Our experiments are carried out in a ten-fold cross-validation fashion on three different datasets: the GPDS-ULPGC Face Dataset, the PIE Face Dataset and the RaFD Face Dataset. Results show that our approach achieved average classification accuracies of 99.8%, 97.13%, and 98.10% on these three datasets, respectively, using only two training images per class. Our comparative studies further show that the DHMMK achieved a 53% improvement over the baseline HMM approach. The comparative ROC curves also confirm the efficacy of the proposed lip-contour-based biometrics learnt by the DHMMK. We also show that the performance of linear and RBF SVMs is comparable under the framework of the DHMMK.
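A schematic of the DHMMK-plus-SVM pipeline, assuming a hypothetical `dhmm_loglik` scorer in place of trained discrete HMMs: each sequence of quantised lip-shape features is mapped to its vector of per-model log-likelihoods, and an SVM with a precomputed kernel classifies in that space. This is an illustrative sketch, not the paper's implementation.

```python
# DHMM-score feature map + SVM with a precomputed kernel (illustrative only).
import numpy as np
from sklearn.svm import SVC

def dhmm_loglik(sequence, model):
    # Placeholder: a real implementation would run the forward algorithm of a trained
    # discrete HMM over the symbol sequence and return its log-likelihood.
    rng = np.random.default_rng(hash((tuple(sequence), model)) % (2**32))
    return rng.normal()

def hmm_feature_map(sequences, models):
    # Map each sequence to its vector of per-model log-likelihoods.
    return np.array([[dhmm_loglik(s, m) for m in models] for s in sequences])

def linear_kernel(A, B):
    return A @ B.T

# Toy usage: 3 classes, one "model id" per class, random quantised feature sequences.
models = [0, 1, 2]
train_seqs = [list(np.random.default_rng(i).integers(0, 16, size=20)) for i in range(30)]
labels = np.repeat([0, 1, 2], 10)
Phi = hmm_feature_map(train_seqs, models)
svm = SVC(kernel="precomputed").fit(linear_kernel(Phi, Phi), labels)
pred = svm.predict(linear_kernel(hmm_feature_map(train_seqs[:5], models), Phi))
```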
Abstract:
Importance: This article provides, to our knowledge, the first longitudinal population-based data on refractive error (RE) in Chinese persons.
Objective: To study cohort effects and changes associated with aging in REs among Chinese adults.
Design, Setting, and Participants: A 2-year, longitudinal population-based cohort study was conducted in southern China. Participants, identified using cluster random sampling, included residents of Yuexiu District, Guangzhou, China, aged 35 years or older who had undergone no previous eye surgery.
Methods: Participants underwent noncycloplegic automated refraction and keratometry in December 2008 and December 2010; in a random 50% sample of the participants, anterior segment ocular coherence tomography measurement of lens thickness, as well as measurement of axial length and anterior chamber depth by partial coherence laser interferometry, were performed.
Main Outcomes and Measures: Two-year change in spherical equivalent refraction (RE), lens thickness, axial length, and anterior chamber depth in the right eye.
Results: A total of 745 individuals underwent biometric testing in both 2008 and 2010 (2008 mean [SD] age, 52.2 [11.5] years; 53.7% women). Mean RE showed a 2-year hyperopic shift from −0.44 (2.21) to −0.31 (2.26) diopters (D) (difference, +0.13; 95% CI, 0.11 to 0.16). A consistent 2-year hyperopic shift of 0.09 to 0.22 D was observed among participants aged 35 to 64 years when stratifying by decade, suggesting that a substantial change in RE with aging may occur during this 30-year period. Cross-sectionally, RE increased only in the cohort younger than 50 years (0.11 D/y; 95% CI, 0.06 to 0.16). In the cross-sectional data, axial length decreased at −0.06 mm/y (95% CI, −0.09 to −0.04), although the 2-year change in axial length was positive and thus could not explain the cross-sectional difference. These latter results suggest a cohort effect, with greater myopia developing among younger persons.
Conclusions and Relevance: This first Chinese population-based longitudinal study of RE provides evidence for both important longitudinal aging changes and cohort effects, most notably greater myopia prevalence among younger persons.
Abstract:
Purpose: To assess the repeatability and accuracy of optical biometry (Lenstar LS900 optical low-coherence reflectometry [OLCR] and IOLMaster partial coherence interferometry [PCI]) and applanation ultrasound biometry in highly myopic eyes. Setting: Division of Preventive Ophthalmology, Zhongshan Ophthalmic Center, Guangzhou, China. Design: Comparative evaluation of diagnostic technology. Methods: Biometric measurements were taken in highly myopic subjects with a spherical equivalent (SE) of -6.00 diopters (D) or higher and an axial length (AL) longer than 25.0 mm. Measurements of AL and anterior chamber depth (ACD) obtained by OLCR were compared with those obtained by PCI and applanation A-scan ultrasound. Right eyes were analyzed. Repeatability was evaluated using the coefficient of variation (CoV), and agreement using Bland-Altman analyses. Results: The mean SE was -11.20 D ± 4.65 (SD). The CoVs for repeated AL measurements using OLCR, PCI, and applanation ultrasound were 0.06%, 0.07%, and 0.20%, respectively. The limits of agreement (LoA) for AL were 0.11 mm between OLCR and PCI, 1.01 mm between OLCR and applanation ultrasound, and 1.03 mm between PCI and ultrasound. The corresponding LoA values for ACD were 0.29 mm, 0.53 mm, and 0.51 mm, respectively. These repeatability and agreement results were comparable in eyes with extreme myopia (AL ≥27.0 mm) or posterior staphyloma. The mean radius of corneal curvature was similar between OLCR and PCI (7.66 ± 0.24 mm versus 7.64 ± 0.25 mm), with an LoA of 0.12 mm. Conclusion: Optical biometry provided more repeatable and precise measurements of biometric parameters, including AL and ACD, than applanation ultrasound biometry in highly myopic eyes. Financial Disclosure: No author has a financial or proprietary interest in any material or method mentioned. © 2012 ASCRS and ESCRS.
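A short sketch of the two statistics used above, the coefficient of variation for repeatability and Bland-Altman 95% limits of agreement between devices; the arrays are simulated placeholders, not study data.

```python
# Coefficient of variation (repeatability) and Bland-Altman limits of agreement (device agreement).
import numpy as np

def coefficient_of_variation(repeated):
    """repeated: (n_subjects, n_repeats) measurements from one device; returns percent."""
    return np.mean(repeated.std(axis=1, ddof=1) / repeated.mean(axis=1)) * 100

def bland_altman_loa(a, b):
    """a, b: paired measurements (e.g. AL by OLCR vs. PCI) for the same eyes."""
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

rng = np.random.default_rng(0)
al_olcr = rng.normal(27.0, 1.5, size=50)                           # axial length, mm
al_pci = al_olcr + rng.normal(0.0, 0.03, size=50)                  # small inter-device difference
repeats = al_olcr[:, None] + rng.normal(0.0, 0.02, size=(50, 3))   # 3 repeated scans per eye

print("CoV (%):", round(coefficient_of_variation(repeats), 3))
print("Bland-Altman bias and LoA:", bland_altman_loa(al_olcr, al_pci))
```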
Abstract:
OBJECTIVE:
To design a system of gonioscopy that will allow greater interobserver reliability and more clearly defined screening cutoffs for angle closure than current systems while being simple to teach and technologically appropriate for use in rural Asia, where the prevalence of angle-closure glaucoma is highest.
DESIGN:
Clinic-based validation and interobserver reliability trial.
PARTICIPANTS:
Study 1: 21 patients 18 years of age and older recruited from a university-based specialty glaucoma clinic; study 2: 32 patients 18 years of age and older recruited from the same clinic.
INTERVENTION:
In study 1, all participants underwent conventional gonioscopy by an experienced observer (GLS) using the Spaeth system and in the same eye also underwent Scheimpflug photography, ultrasonographic measurement of anterior chamber depth and axial length, automatic refraction, and biometric gonioscopy with measurement of the distance from iris insertion to Schwalbe's line using a reticule based in the slit-lamp ocular. In study 2, all participants underwent both conventional gonioscopy and biometric gonioscopy by an experienced gonioscopist (NGC) and a medical student with no previous training in gonioscopy (JK).
MAIN OUTCOME MEASURES:
Study 1: The association between biometric gonioscopy and conventional gonioscopy, Scheimpflug photography, and other factors known to correlate with the configuration of the angle. Study 2: Interobserver agreement using biometric gonioscopy compared to that obtained with conventional gonioscopy.
RESULTS:
In study 1, there was an independent, monotonic, statistically significant relationship between biometric gonioscopy and both Spaeth angle (P = 0.001, t test) and Spaeth insertion (P = 0.008, t test) grades. Biometric gonioscopy correctly identified six of six patients with occludable angles according to Spaeth criteria. Biometric gonioscopic grade was also significantly associated with the anterior chamber angle as measured by Scheimpflug photography (P = 0.005, t test). In study 2, the intraclass correlation coefficient between graders for biometric gonioscopy (0.97) was higher than for Spaeth angle grade (0.72) or Spaeth insertion grade (0.84).
CONCLUSION:
Biometric gonioscopy correlates well with other measures of the anterior chamber angle, shows a higher degree of interobserver reliability than conventional gonioscopy, and can readily be learned by an inexperienced observer.
Abstract:
AIM:
To utilise a novel method for making measurements in the anterior chamber in order to compare the anterior chamber angles of people of European, African, and East Asian descent aged 40 years and over.
METHODS:
A cross-sectional study of 15 people of each sex from each decade from the 40s to the 70s, from each of three racial groups: black, white, and Chinese Singaporeans. Biometric gonioscopy (BG) utilises a slit-lamp-mounted reticule to make measurements from the apparent iris insertion to Schwalbe's line through a Goldmann one-mirror goniolens. The main outcome measures were BG measurements of the anterior chamber angle as detailed above.
RESULTS:
There was no significant difference in angle measurement between black, white, and Chinese races in this study. However, at younger ages people of Chinese race appeared to have deeper angles than white or black people, whereas the angles of older Chinese were significantly narrower (p = 0.004 for the difference in slope of BG by age between Chinese and both black and white people).
CONCLUSION:
The failure to detect a difference in angle measurements between these groups was surprising, given the much higher prevalence of angle closure among Chinese. It appears that the overall apparent similarity of BG means between Chinese and Western populations may mask very different trends with age. The apparently more rapid decline in angle width measurements with age among Chinese may be due to the higher prevalence of cataract or "creeping angle closure." However, longitudinal inferences from cross sectional data are problematic, and this may represent a cohort phenomenon caused by the increasing prevalence of myopia in the younger Singaporean population.
Abstract:
In order to protect user privacy on mobile devices, an event-driven implicit authentication scheme is proposed in this paper. Several methods of utilizing the scheme for recognizing legitimate user behavior are investigated. The investigated methods compute an aggregate score and a threshold in real time to determine the trust level of the current user, using real data derived from user interaction with the device. The proposed scheme is designed to operate completely in the background, require a minimal training period, enable a high user recognition rate for implicit authentication, and provide prompt detection of abnormal activity that can be used to trigger explicitly authenticated access control. In this paper, we investigate threshold computation through standard deviation-based and EWMA (exponentially weighted moving average)-based algorithms. The results of extensive experiments on user data collected over a period of several weeks from an Android phone indicate that our proposed approach is feasible and effective for lightweight real-time implicit authentication on mobile smartphones.
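A minimal sketch of the EWMA-based thresholding idea, assuming a one-dimensional stream of behaviour scores; the smoothing factor, multiplier, and data are illustrative, not the paper's parameters.

```python
# Compare each new behaviour score against an exponentially weighted moving average
# (and deviation) of past scores; a large drop flags abnormal activity.
import numpy as np

def ewma_monitor(scores, alpha=0.2, k=2.5):
    """Yield (score, threshold, is_anomalous) for a stream of behaviour scores."""
    mean, var = scores[0], 0.0
    for s in scores:
        threshold = mean - k * np.sqrt(var)        # lower bound of "normal" behaviour
        yield s, threshold, s < threshold
        # Update the EWMA of the mean and of the squared deviation.
        diff = s - mean
        mean += alpha * diff
        var = (1 - alpha) * (var + alpha * diff * diff)

stream = np.concatenate([np.random.default_rng(0).normal(10, 1, 200),   # legitimate user
                         np.random.default_rng(1).normal(4, 1, 20)])    # anomalous activity
flags = [anomalous for _, _, anomalous in ewma_monitor(stream)]
```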
Abstract:
In the context of products from certain regions or countries being banned because of an identified or non-identified hazard, proof of geographical origin is essential with regard to feed and food safety issues. Usually, the product labeling of an affected feed lot shows origin, and the paper documentation shows traceability. Incorrect product labeling is common in embargo situations, however, and alternative analytical strategies for controlling feed authenticity are therefore needed. In this study, distillers' dried grains and solubles (DDGS) were chosen as the product on which to base a comparison of analytical strategies aimed at identifying the most appropriate one. Various analytical techniques were investigated for their ability to authenticate DDGS, including spectroscopic and spectrometric techniques combined with multivariate data analysis, as well as proven techniques for authenticating food, such as DNA analysis and stable isotope ratio analysis. An external validation procedure (called the system challenge) was used to analyze sample sets blind and to compare analytical techniques. All the techniques were adapted so as to be applicable to the DDGS matrix. They produced positive results in determining the botanical origin of DDGS (corn vs. wheat), and several of them were able to determine the geographical origin of the DDGS in the sample set. The maintenance and extension of the databanks generated in this study through the analysis of new authentic samples from a single location are essential in order to monitor developments and processing that could affect authentication.
Abstract:
In recent years, the adaptation of Wireless Sensor Networks (WSNs) to application areas requiring mobility has increased the security threats against the confidentiality, integrity and privacy of information, as well as against network connectivity. Since key management plays an important role in securing both information and connectivity, a proper authentication and key management scheme is required in mobility-enabled applications, where the authentication of a node with the network is a critical issue. In this paper, we present an authentication and key management scheme supporting node mobility in a heterogeneous WSN that consists of several low-capability sensor nodes and a few high-capability sensor nodes. We analyze our proposed solution analytically using MATLAB and by simulation (the OMNET++ simulator) to show that it has lower memory requirements and better network connectivity and resilience against attacks than some existing schemes. We also propose two levels of authentication methods for the mobile sensor nodes to support secure authentication and key establishment.
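The paper's exact protocol is not reproduced here; as a generic illustration of pre-shared-key authentication between a mobile node and a high-capability node, the sketch below uses an HMAC-SHA256 challenge-response, with all identifiers and key handling chosen for illustration only.

```python
# Generic pre-shared-key challenge-response (illustrative, not the paper's scheme).
import hmac, hashlib, os

shared_key = os.urandom(32)          # provisioned to the mobile node before deployment

def issue_challenge():
    return os.urandom(16)            # fresh nonce from the high-capability node

def node_response(key, node_id, challenge):
    return hmac.new(key, node_id + challenge, hashlib.sha256).digest()

def verify_node(key, node_id, challenge, response):
    expected = hmac.new(key, node_id + challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

node_id = b"node-17"
challenge = issue_challenge()
assert verify_node(shared_key, node_id, challenge, node_response(shared_key, node_id, challenge))
```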
Abstract:
In order to address the increasing compromise of user privacy on mobile devices, a Fuzzy Logic based implicit authentication scheme is proposed in this paper. The proposed scheme computes an aggregate score based on selected features, and a threshold in real time based on current and historic data depicting the user's routine. The tuned fuzzy system is then applied to the aggregated score and the threshold to determine the trust level of the current user. The proposed fuzzy-integrated implicit authentication scheme is designed to operate adaptively and completely in the background, require a minimal training period, and enable high system accuracy while providing timely detection of abnormal activity. In this paper, we explore Fuzzy Logic based authentication in depth. Gaussian and triangle-based membership functions are investigated and compared using real data collected over several weeks from different Android phone users. The presented results show that our proposed Fuzzy Logic approach is a highly effective and viable scheme for lightweight real-time implicit authentication on mobile devices.
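A small sketch of the two membership-function families compared above (Gaussian and triangular), applied to a normalised behaviour score to obtain a trust grade; the centres, widths, and trust labels are illustrative choices, not the paper's tuned system.

```python
# Gaussian and triangular membership functions mapping a score in [0, 1] to trust grades.
import numpy as np

def gaussian_mf(x, c, sigma):
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def triangular_mf(x, a, b, c):
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def trust_level(score):
    """Fuzzy memberships of three trust grades for a normalised behaviour score."""
    return {
        "low":    gaussian_mf(score, c=0.1, sigma=0.15),
        "medium": triangular_mf(score, a=0.25, b=0.5, c=0.75),
        "high":   gaussian_mf(score, c=0.9, sigma=0.15),
    }

memberships = trust_level(0.72)
decision = max(memberships, key=memberships.get)   # defuzzify by maximum membership
```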
Abstract:
A novel wireless local area network (WLAN) security processor is described in this paper. It is designed to offload security encapsulation processing from the host microprocessor in an IEEE 802.11i compliant medium access control layer to a programmable hardware accelerator. The unique design, which comprises dedicated cryptographic instructions and hardware coprocessors, is capable of performing wired equivalent privacy, temporal key integrity protocol, counter mode with cipher block chaining message authentication code protocol, and wireless robust authentication protocol. Existing solutions to wireless security have been implemented on hardware devices and target specific WLAN protocols, whereas the programmable security processor proposed in this paper supports all WLAN protocols and can thus offer backwards compatibility as well as future upgradability as standards evolve. It provides this additional functionality while still achieving throughput rates equivalent to existing architectures. © 2006 IEEE.
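As a software-level illustration of the CCMP data path (AES in CCM mode with an 8-byte MIC), the sketch below uses the `cryptography` package; the key, nonce, and header bytes are examples, whereas a real 802.11i implementation derives them from frame fields.

```python
# AES-CCM authenticated encryption, analogous to CCMP's per-frame protection.
from cryptography.hazmat.primitives.ciphers.aead import AESCCM
import os

key = AESCCM.generate_key(bit_length=128)
aesccm = AESCCM(key, tag_length=8)          # CCMP uses an 8-byte MIC

nonce = os.urandom(13)                      # 13-byte nonce, as in CCMP
header = b"frame-header-fields"             # authenticated but not encrypted (AAD)
payload = b"frame body to protect"

ciphertext = aesccm.encrypt(nonce, payload, header)        # payload with appended MIC
assert aesccm.decrypt(nonce, ciphertext, header) == payload
```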
Abstract:
This paper investigates the problem of speaker identification and verification in noisy conditions, assuming that speech signals are corrupted by environmental noise, but knowledge about the noise characteristics is not available. This research is motivated in part by the potential application of speaker recognition technologies on handheld devices or the Internet. While the technologies promise an additional biometric layer of security to protect the user, the practical implementation of such systems faces many challenges. One of these is environmental noise. Due to the mobile nature of such systems, the noise sources can be highly time-varying and potentially unknown. This raises the requirement for noise robustness in the absence of information about the noise. This paper describes a method that combines multicondition model training and missing-feature theory to model noise with unknown temporal-spectral characteristics. Multicondition training is conducted using simulated noisy data with limited noise variation, providing a "coarse" compensation for the noise, and missing-feature theory is applied to refine the compensation by ignoring noise variation outside the given training conditions, thereby reducing the training and testing mismatch. This paper is focused on several issues relating to the implementation of the new model for real-world applications. These include the generation of multicondition training data to model noisy speech, the combination of different training data to optimize the recognition performance, and the reduction of the model's complexity. The new algorithm was tested using two databases with simulated and realistic noisy speech data. The first database is a redevelopment of the TIMIT database by rerecording the data in the presence of various noise types, used to test the model for speaker identification with a focus on the varieties of noise. The second database is a handheld-device database collected in realistic noisy conditions, used to further validate the model for real-world speaker verification. The new model is compared to baseline systems and is found to achieve lower error rates.
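A sketch of multicondition training for GMM-based speaker identification, in the spirit of the "coarse" compensation step described above: each speaker model is fitted on features pooled over several simulated noise conditions, and identification picks the model with the highest average log-likelihood. Feature extraction, noise simulation, and all parameters are placeholders, not the paper's setup, and the missing-feature refinement is omitted.

```python
# Multicondition training of per-speaker Gaussian mixture models (illustrative only).
import numpy as np
from sklearn.mixture import GaussianMixture

def multicondition_features(clean, noise_levels, rng):
    # Pool features from the clean signal and noisy versions at several noise levels.
    return np.vstack([clean + rng.normal(0, lvl, clean.shape) for lvl in noise_levels])

rng = np.random.default_rng(0)
noise_levels = [0.0, 0.1, 0.3]              # stand-ins for different SNR conditions

# Hypothetical per-speaker training features (e.g. 13-dimensional MFCC frames).
speakers = {s: rng.normal(loc=s, scale=1.0, size=(300, 13)) for s in range(3)}
models = {s: GaussianMixture(n_components=8, covariance_type="diag", random_state=0)
             .fit(multicondition_features(feats, noise_levels, rng))
          for s, feats in speakers.items()}

def identify(test_frames):
    # Average log-likelihood under each speaker model; the highest-scoring speaker wins.
    return max(models, key=lambda s: models[s].score(test_frames))

test = speakers[1][:50] + rng.normal(0, 0.2, size=(50, 13))   # noisy probe from speaker 1
print("identified speaker:", identify(test))
```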