939 results for Signature verification


Relevance:

100.00%

Publisher:

Abstract:

Automatic signature verification is a well-established and active area of research with numerous applications, such as bank check verification and ATM access. This paper proposes a novel approach to the problem of automatic off-line signature verification and forgery detection. The proposed approach is based on fuzzy modeling employing the Takagi-Sugeno (TS) model. Signature verification and forgery detection are carried out using angle features extracted with a box approach. Each feature corresponds to a fuzzy set. The features are fuzzified by an exponential membership function involved in the TS model, which is modified to include structural parameters. The structural parameters are devised to account for possible variations due to handwriting styles and moods. The membership functions constitute weights in the TS model, and optimizing the model's output with respect to the structural parameters yields their solution. We also derive two TS models: the first formulation considers a rule for each input feature (multiple rules), while the second considers a single rule for all input features. We find that the TS model with multiple rules is better than the single-rule TS model at detecting three types of forgeries (random, skilled, and unskilled) from a large database of sample signatures, in addition to verifying genuine signatures. We also devise three approaches, viz., one innovative and two intuitive, using the TS model with multiple rules for improved performance. (C) 2004 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
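
As a rough illustration of the fuzzification step described above, the sketch below (Python) fuzzifies angle features with an exponential membership function and aggregates the memberships into a verification score. The structural parameters `s` and `t`, the feature centers `c`, and the acceptance threshold are hypothetical placeholders, not the paper's exact formulation or optimization procedure.

```python
import numpy as np

def membership(x, c, s, t):
    """Exponential membership of feature values x in fuzzy sets
    centered at c; s and t are hypothetical structural parameters
    that widen the sets to absorb handwriting-style variation."""
    spread = s * (1.0 + t)
    return np.exp(-((x - c) ** 2) / (2.0 * spread ** 2))

def verify(x, c, s, t, threshold=0.5):
    """Aggregate per-feature memberships into one score and decide."""
    score = membership(x, c, s, t).mean()
    return score, score >= threshold

# Toy usage: 8 angle features from a questioned signature
rng = np.random.default_rng(0)
c = rng.uniform(0.2, 0.8, 8)          # centers learned from genuine samples
x = c + rng.normal(0.0, 0.05, 8)      # questioned signature's features
score, genuine = verify(x, c, s=0.1, t=0.2)
print(f"score = {score:.3f}, genuine = {genuine}")
```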

Relevance:

100.00%

Publisher:

Abstract:

This paper presents an innovative approach to signature verification and forgery detection based on fuzzy modeling. The signature image is binarized, resized to a fixed-size window, and then thinned. The thinned image is partitioned into eight sub-images, called boxes, using the horizontal density approximation approach. Each box is then resized and further partitioned into twelve sub-images using the uniform partitioning approach. The feature under consideration is the normalized vector angle (α) extracted from each box. Each feature extracted from the sample signatures gives rise to a fuzzy set. Since the choice of a proper fuzzification function is crucial for verification, we devise a new fuzzification function with structural parameters, which is able to adapt to the variations in the fuzzy sets. This function is employed to develop a complete forgery detection and verification system.
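
A minimal sketch of the angle-feature extraction, under the assumption (not spelled out in this abstract) that each skeleton pixel's angle is measured from the box's bottom-left corner, summed, and normalized by the pixel count and π/2:

```python
import numpy as np

def box_angle_feature(box):
    """Normalized vector-angle feature for one box of a thinned,
    binarized signature image (a 2-D 0/1 array).

    Hypothetical reading of the method: each skeleton pixel's angle
    is measured from the box's bottom-left corner, summed, and
    normalized to [0, 1] by the pixel count and pi/2."""
    rows, cols = np.nonzero(box)
    if rows.size == 0:
        return 0.0
    h = box.shape[0]
    dy = (h - 1) - rows              # height above the bottom edge
    dx = cols + 1                    # avoid a zero denominator at the corner
    angles = np.arctan2(dy, dx)      # angle of each pixel from bottom-left
    return float(angles.sum() / (rows.size * (np.pi / 2)))

# Toy usage: a diagonal stroke inside one box
box = np.eye(8, dtype=np.uint8)
print(f"alpha = {box_angle_feature(box):.3f}")
```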

Relevance:

60.00%

Publisher:

Abstract:

With wireless vehicular communications, Vehicular Ad Hoc Networks (VANETs) enable numerous applications that enhance traffic safety, traffic efficiency, and the driving experience. However, VANETs also pose severe security and privacy challenges that need to be thoroughly investigated. In this dissertation, we enhance the security, privacy, and applications of VANETs by 1) designing application-driven security and privacy solutions for VANETs, and 2) designing appealing VANET applications with proper security and privacy assurance. First, the security and privacy challenges of VANETs with the most application significance are identified and thoroughly investigated. With both theoretical novelty and realistic considerations, these security and privacy schemes are especially appealing for VANETs. Specifically, multi-hop communications in VANETs suffer from packet dropping, packet tampering, and communication failures, which have not been satisfactorily addressed in the literature. Thus, a lightweight reliable and faithful data packet relaying framework (LEAPER) is proposed to ensure reliable and trustworthy multi-hop communications by enhancing the cooperation of neighboring nodes. Message verification, including both content and signature verification, is generally computation-intensive and incurs severe scalability issues at each node. The resource-aware message verification (RAMV) scheme is therefore proposed to ensure resource-aware, secure, and application-friendly message verification in VANETs. On the other hand, to make VANETs acceptable to privacy-sensitive users, the identity and location privacy of each node should be properly protected. To this end, a joint privacy and reputation assurance (JPRA) scheme is proposed to synergistically support privacy protection and reputation management by reconciling their inherently conflicting requirements. In addition, the privacy implications of short-time certificates are thoroughly investigated in a short-time certificates-based privacy protection (STCP2) scheme, making privacy protection in VANETs feasible with short-time certificates. Second, three novel solutions, namely VANET-based ambient ad dissemination (VAAD), general-purpose automatic survey (GPAS), and VehicleView, are proposed to support appealing value-added applications based on VANETs. These solutions all follow practical application models, and an incentive-centered architecture is proposed for each solution to balance the conflicting requirements of the involved entities. The critical security and privacy challenges of these applications are also investigated and addressed with novel solutions. With proper security and privacy assurance, these solutions thus show great application significance and economic potential for VANETs. By enhancing the security, privacy, and applications of VANETs, this dissertation fills the gap between existing theoretical research and the realistic implementation of VANETs, facilitating their real-world deployment.

Relevance:

30.00%

Publisher:

Abstract:

This article evaluates an authentication technique for mobiles based on gestures. Users create a remindful identifying gesture to be considered as their in-air signature. This work analyzes a database of 120 gestures of different vulnerability, obtaining an Equal Error Rate (EER) of 9.19% when robustness of gestures is not verified. Most of the errors in this EER come from very simple and easily forgeable gestures that should be discarded at enrollment phase. Therefore, an in-air signature robustness verification system using Linear Discriminant Analysis is proposed to infer automatically whether the gesture is secure or not. Different configurations have been tested obtaining a lowest EER of 4.01% when 45.02% of gestures were discarded, and an optimal compromise of EER of 4.82% when 19.19% of gestures were automatically rejected.
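
The robustness check reduces to a two-class problem: decide at enrollment whether a gesture is secure enough to keep. A minimal sketch with scikit-learn's LDA (illustrative only; the per-gesture descriptors, training data, and class labels below are hypothetical stand-ins for the paper's features):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical per-gesture descriptors: duration, path length, and
# acceleration variance of the in-air trajectory (not the paper's
# actual feature set).
rng = np.random.default_rng(1)
X_simple  = rng.normal([1.0, 0.5, 0.2], 0.1, size=(50, 3))   # forgeable
X_complex = rng.normal([2.5, 1.8, 0.9], 0.1, size=(50, 3))   # robust
X = np.vstack([X_simple, X_complex])
y = np.array([0] * 50 + [1] * 50)   # 0 = reject at enrollment, 1 = accept

lda = LinearDiscriminantAnalysis().fit(X, y)
new_gesture = np.array([[2.3, 1.6, 0.8]])
print("accept" if lda.predict(new_gesture)[0] == 1 else "re-enroll")
```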

Relevance:

20.00%

Publisher:

Abstract:

The Fourier transform-infrared (FT-IR) signature of dry samples of DNA and DNA-polypeptide complexes, as studied by IR microspectroscopy using a diamond attenuated total reflection (ATR) objective, has revealed important discriminatory characteristics related to the PO2(-) vibrational stretching modes. However, DNA IR markers that provide information on a sample's richness in hydrogen bonds have not been resolved in the spectral profiles obtained with this objective. Here we investigated the performance of an all-reflecting objective (ARO) for analyzing the FT-IR signal of hydrogen bonds in DNA samples differing in base-richness type (salmon testis vs. calf thymus). The results obtained using the ARO show prominent band peaks in the spectral region representative of the vibration of nitrogenous-base hydrogen bonds and of NH and NH2 groups. With the ARO, the band areas in this spectral region differ in accordance with the DNA base-richness type. A peak assigned to adenine was more evident in the AT-rich salmon DNA using either the ARO or the ATR objective. It is concluded that, for discriminating DNA IR hydrogen-bond vibrations associated with varying base-type proportions, the use of an ARO is recommended.

Relevance:

20.00%

Publisher:

Abstract:

Dynamical Chern-Simons gravity is an extension of general relativity in which the gravitational field is coupled to a scalar field through a parity-violating Chern-Simons term. In this framework, we study perturbations of spherically symmetric black hole spacetimes, assuming that the background scalar field vanishes. Our results suggest that these spacetimes are stable, and small perturbations die away as a ringdown. However, in contrast to standard general relativity, the gravitational waveforms are also driven by the scalar field. Thus, the gravitational oscillation modes of black holes carry imprints of the coupling to the scalar field. This is a smoking gun for Chern-Simons theory and could be tested with gravitational-wave detectors, such as LIGO or LISA. For negative values of the coupling constant, ghosts are known to arise, and we explicitly verify their appearance numerically. Our results are validated using both time evolution and frequency domain methods.
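
For context (standard notation from the dynamical Chern-Simons literature, not taken from this abstract; normalizations of the couplings α and β vary between papers), the action and scalar equation of motion are commonly written as:

```latex
% Dynamical Chern-Simons action (one common normalization):
S = \int d^4x\,\sqrt{-g}\,\Big[\kappa\,R
      + \tfrac{\alpha}{4}\,\vartheta\,{}^{*}\!RR
      - \tfrac{\beta}{2}\,\nabla_\mu\vartheta\,\nabla^\mu\vartheta\Big]
      + S_{\text{mat}},
\qquad {}^{*}\!RR \equiv {}^{*}\!R^{a}{}_{b}{}^{cd}\,R^{b}{}_{acd}
% Scalar-field equation of motion (up to sign conventions):
\beta\,\Box\vartheta = -\tfrac{\alpha}{4}\,{}^{*}\!RR
```

The Pontryagin density ⁎RR vanishes on a spherically symmetric background, consistent with the vanishing background scalar assumed above; at first perturbative order it couples the metric and scalar perturbations, which is why the ringdown modes carry imprints of the scalar field.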

Relevance:

20.00%

Publisher:

Abstract:

We investigate the effect of an interaction between dark energy and dark matter on the dynamics of galaxy clusters. This effect is computed through the Layzer-Irvine equation, which describes how an astrophysical system reaches virial equilibrium and which we modify to include the dark interactions. Using observational data from almost 100 purportedly relaxed galaxy clusters, we put constraints on the strength of the couplings in the dark sector. We compare our results with those from other observations and find that a positive (in the sense of energy flow from dark energy to dark matter), nonvanishing interaction is consistent with the data within several standard deviations.
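
For reference, the uncoupled Layzer-Irvine equation reads as follows; the coupled form is shown only schematically, with an illustrative coupling parameter ζ that is not this abstract's notation:

```latex
% Uncoupled Layzer-Irvine equation (kinetic energy density \rho_K,
% potential energy density \rho_W, Hubble rate H):
\dot{\rho}_K + \dot{\rho}_W + H\,(2\rho_K + \rho_W) = 0
% Virial equilibrium then corresponds to 2\rho_K + \rho_W = 0.
% A dark-sector coupling adds source terms and shifts the
% equilibrium condition, schematically:
2\rho_K + (1 + \zeta)\,\rho_W = 0
```

Measured kinetic-to-potential energy ratios of relaxed clusters can then constrain the coupling, which is the strategy the abstract describes.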

Relevance:

20.00%

Publisher:

Abstract:

The development of Nb(3)Al and Nb(3)Sn superconductors is of great interest to the applied superconductivity area. These intermetallic compounds are normally obtained by heat-treatment reactions at high temperature. Processes that allow formation of the superconducting phases at lower temperatures (<1000 degrees C), particularly for Nb(3)Al, are therefore of great interest. The present work studies the phase formation and stability of the Nb(3)Al and Nb(3)Sn superconducting phases using mechanical alloying (high-energy ball milling). Our main objective was to form composites near stoichiometry that could be transformed into the superconducting phases using low-temperature heat treatments. High-purity Nb-Sn and Nb-Al powders were mixed to the required compositions (Nb-25at.%Sn and Nb-25at.%Al) in an argon-atmosphere glove box. After milling in a Fritsch mill, the samples were compressed in a hydraulic uniaxial press and encapsulated in evacuated quartz tubes for heat treatment. The compressed and heat-treated samples were characterized using X-ray diffractometry. Microstructural and chemical analyses were carried out using scanning electron microscopy and energy-dispersive spectrometry. Nb(3)Al XRD peaks were observed after sintering at 800 degrees C for the sample milled for 30 h. Nb(3)Sn XRD peaks could be observed even before the heat treatment. (C) 2009 Elsevier B.V. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents results of a verification test of a mixed high-order-accuracy Direct Numerical Simulation code using the method of manufactured solutions (MMS). The test is based on the formulation of an analytical solution for the Navier-Stokes equations modified by the addition of a source term. The numerical code was aimed at simulating the temporal evolution of instability waves in plane Poiseuille flow. The governing equations were solved in a vorticity-velocity formulation for two-dimensional incompressible flow. The code employed two different numerical schemes: one used mixed high-order compact and non-compact finite differences from fourth- to sixth-order accuracy; the other used spectral methods instead of finite differences in the streamwise direction, which was periodic. In the present test, particular attention was paid to the boundary conditions of the physical problem of interest. Indeed, the verification procedure using MMS can be more demanding than the commonly used comparison with Linear Stability Theory, particularly because the latter test pays no attention to the nonlinear terms. For the present verification test, it was possible to manufacture an analytical solution that reproduced some aspects of an instability wave in a nonlinear stage. Although the results of the verification by MMS for this mixed-order numerical scheme had to be interpreted with care, the test was very useful, as it gave confidence that the code was free of programming errors. Copyright (C) 2009 John Wiley & Sons, Ltd.
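
The core MMS recipe is independent of the particular equations: pick a manufactured solution, derive the source term that makes it exact, and confirm that the numerical error shrinks at the scheme's formal order. A minimal sketch on a model 1-D Poisson problem (illustrative only; not the paper's vorticity-velocity Navier-Stokes setup):

```python
import numpy as np

# Manufactured solution u_m and its exact source term S = u_m''
u_m = lambda x: np.sin(2 * np.pi * x)
source = lambda x: -(2 * np.pi) ** 2 * np.sin(2 * np.pi * x)

def solve_error(n):
    """Max-norm error of a 2nd-order scheme solving u'' = S on [0, 1]."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    # Standard 3-point Laplacian with Dirichlet BCs taken from u_m
    A = (np.diag(-2.0 * np.ones(n - 1))
         + np.diag(np.ones(n - 2), 1)
         + np.diag(np.ones(n - 2), -1)) / h ** 2
    b = source(x[1:-1])
    b[0]  -= u_m(x[0]) / h ** 2      # fold boundary values into the RHS
    b[-1] -= u_m(x[-1]) / h ** 2
    u = np.linalg.solve(A, b)
    return np.max(np.abs(u - u_m(x[1:-1])))

# Refine the grid and measure the observed order of accuracy
e1, e2 = solve_error(32), solve_error(64)
print(f"observed order ~ {np.log(e1 / e2) / np.log(2):.2f}")  # ~2 expected
```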

Relevance:

20.00%

Publisher:

Abstract:

Chloride attack in marine environments, or in structures where deicing salts are used, does not always produce profiles with concentrations that decrease from the external surface toward the interior of the concrete. Some profiles show chloride concentrations that increase with depth up to a point where a peak is formed. This type of profile must be analyzed differently from the traditional model based on Fick's second law in order to generate more precise service-life models. A model has previously been proposed for forecasting the penetration of chloride ions as a function of time in profiles that have formed a peak. To confirm the efficiency of this model, it is necessary to observe the behavior of a chloride profile with a peak in a specific structure over a period of time. To achieve this, two chloride profiles of different ages (22 and 27 years) were extracted from the same structure. The profile obtained from the 22-year sample was used to estimate the chloride profile at 27 years using three models: a) the traditional model using Fick's second law and extrapolating the value of C(S), the external-surface chloride concentration; b) the traditional model using Fick's second law and shifting the x-axis to the peak depth; c) the previously proposed model. The results from these models were compared with the actual profile measured in the 27-year sample and analyzed. The proposed model showed good precision in this case study, though it still needs to be tested on other in-service structures.
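
For reference, models a) and b) rest on the classical error-function solution of Fick's second law; the shifted form for model b) is written out below under the usual constant-diffusivity assumptions (notation mine, not the paper's):

```latex
% Model a): fit the whole profile with the classical solution,
% extrapolating the surface concentration C_s:
C(x,t) = C_s\left[1 - \operatorname{erf}\!\left(\frac{x}{2\sqrt{D\,t}}\right)\right]
% Model b): shift the origin to the peak depth x_p and fit only the
% inward branch of the profile (x >= x_p), with C_p the peak concentration:
C(x,t) = C_p\left[1 - \operatorname{erf}\!\left(\frac{x - x_p}{2\sqrt{D\,t}}\right)\right]
```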

Relevance:

20.00%

Publisher:

Abstract:

We describe a one-time signature scheme based on the hardness of the syndrome decoding problem, and prove it secure in the random oracle model. Our proposal can be instantiated on general linear error-correcting codes, rather than restricted families like alternant codes for which a decoding trapdoor is known to exist. (C) 2010 Elsevier Inc. All rights reserved.
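
A minimal sketch of the syndrome computation that the scheme's hardness rests on (toy parameters; the actual key generation, hashing, and signature construction are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(42)
n, k, w = 16, 8, 2                       # toy code length, dimension, weight

H = rng.integers(0, 2, size=(n - k, n))  # parity-check matrix over GF(2)

e = np.zeros(n, dtype=np.int64)          # secret low-weight error vector
e[rng.choice(n, size=w, replace=False)] = 1

s = H @ e % 2                            # public syndrome s = H e^T (mod 2)
print("syndrome:", s)

# Verification recomputes H e'^T and compares it with s; finding a
# valid e' of weight w for a given s is the (hard) syndrome decoding
# problem that underpins security.
assert np.array_equal(H @ e % 2, s)
```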

Relevance:

20.00%

Publisher:

Abstract:

A large percentage of pile caps support only one column, and the pile caps in turn are supported by only a few piles. These are typically short, deep members with overall span-depth ratios of less than 1.5. Codes of practice do not provide uniform treatment for the design of these types of pile caps. Such members have traditionally been designed as beams spanning between piles, with the depth selected to avoid shear failures and the amount of longitudinal reinforcement selected to provide sufficient flexural capacity as calculated by engineering beam theory. More recently, the strut-and-tie method has been used for the design of pile caps (a disturbed, or D-, region), in which the load path is envisaged as a three-dimensional truss, with compressive forces carried by concrete struts between the column and the piles and tensile forces carried by reinforcing steel located between the piles. Neither of these models has provided uniform factors of safety against failure or been able to predict whether failure will occur by flexure (a ductile mode) or by shear (a brittle mode). In this paper, an analytical model based on the strut-and-tie approach is presented. The proposed model has been calibrated using an extensive experimental database of pile caps subjected to compression and evaluated analytically for more complex loading conditions. It has proven applicable across a broad range of test data and can predict the failure modes, cracking, yielding, and failure loads of four-pile caps with reasonable accuracy.
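
To make the three-dimensional truss idea concrete, the sketch below works out the textbook statics of a symmetric four-pile cap under a centered point load, ignoring the column dimensions. This is elementary equilibrium for illustration only, not the paper's calibrated analytical model:

```python
import math

P = 2000.0   # column load, kN
s = 1.8      # pile spacing in each plan direction, m
d = 0.9      # internal lever arm (truss depth), m

r = s * math.sqrt(2) / 2                 # plan distance, column to each pile
strut = (P / 4) * math.hypot(r, d) / d   # compression in each diagonal strut
tie = P * s / (8 * d)                    # tension in each of the four side ties

print(f"strut compression: {strut:.0f} kN")
print(f"tie tension:       {tie:.0f} kN")
```

Each strut carries a quarter of the load along the column-to-pile diagonal; its horizontal component at a pile is equilibrated by the two ties meeting there, which gives the classic tie force P·s/(8d).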

Relevance:

20.00%

Publisher:

Abstract:

Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based on a plastic-collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low-strength structural steels, which do not necessarily address the specific requirements of the high-grade steels currently in use. In such cases, failure assessments may be overly conservative or show significant scatter in their predictions, leading to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion derived from plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load, to support the development of stress-based burst-strength criteria. The work provides an extensive body of results which lends further support to adopting failure criteria for corroded pipelines based on ligament instability analyses. A verification study conducted on burst tests of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain-hardening capacity. Overall, the results presented here suggest that stress-based criteria built on plastic instability analysis of the defect ligament are a valid engineering tool for integrity assessments of pipelines with axial corrosion defects. (C) 2008 Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

Modern Integrated Circuit (IC) design is characterized by a strong trend of Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug, but due to state explosion their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee their efficacy. In general, static input-space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although the two act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test-case scenarios are removed from the input space. To this end, a tool to automatically generate PD-based stimuli sources was developed. Additionally, we developed a second tool to generate functional coverage models that fit the PD-based input space exactly. Both the input-stimuli and coverage-model enhancements resulted in a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation-time reduction when generating stimuli with our PD-based stimuli sources (still with a conventional coverage model), and a 56% simulation-time reduction when combining our stimuli sources with their corresponding, automatically generated coverage models.
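
A minimal sketch of the idea behind PD-style input-space pruning (illustrative; the paper's PD formalism and generated testbenches are not reproduced): give each parameter an explicit domain, remove invalid combinations before randomization, and let the pruned space double as the coverage model.

```python
import itertools
import random

# Hypothetical parameter domains for a bus-transaction testbench
DOMAINS = {
    "burst_len": [1, 4, 8, 16],
    "addr_align": [1, 2, 4],
    "mode": ["read", "write"],
}

def valid(tc):
    # Hypothetical constraint: long bursts require word alignment.
    return not (tc["burst_len"] >= 8 and tc["addr_align"] < 4)

# Enumerate the pruned input space once; its elements map one-to-one
# onto functional coverage bins, so the coverage model fits it exactly.
space = [dict(zip(DOMAINS, vals))
         for vals in itertools.product(*DOMAINS.values())]
space = [tc for tc in space if valid(tc)]
covered = set()

for _ in range(40):                      # constrained-random draws
    tc = random.choice(space)
    covered.add(tuple(sorted(tc.items())))

print(f"coverage: {len(covered)}/{len(space)} valid scenarios")
```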

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a novel algorithm for adding and verifying the integrity and authenticity of n-frame DICOM medical images using cryptographic mechanisms. The aim of this work is to enhance DICOM security measures, especially for multiframe images, since current approaches have limitations that should be properly addressed for improved security. The proposed algorithm uses data encryption, together with a digital signature, to provide integrity and authenticity. Relevant header data and the digital signature are used as inputs to cipher the image, so the original data can be retrieved if and only if the images and the inputs are correct. The encryption process itself is a cascading scheme, in which a frame is ciphered with data related to the previous frames, also generating additional data on image integrity and authenticity. Decryption is similar to encryption, and also features the standard security verification of the image. The implementation was done in Java, and a performance evaluation was carried out comparing the speed of the algorithm with other existing approaches. The evaluation showed good performance of the algorithm, an encouraging result for its use in a real environment.
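
A minimal sketch of a cascading multiframe cipher in the spirit described above (illustrative only: AES-CTR with hash-chained nonces stands in for the paper's scheme, and the header fields, signature format, and key management are hypothetical):

```python
import hashlib
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_frames(frames, header: bytes, signature: bytes):
    """Cipher each frame with data tied to the header, the digital
    signature, and all previous frames' ciphertexts."""
    key = hashlib.sha256(header + signature).digest()  # key from header + signature
    chain = hashlib.sha256(signature).digest()[:16]    # initial chaining value
    out = []
    for frame in frames:
        enc = Cipher(algorithms.AES(key), modes.CTR(chain)).encryptor()
        out.append(enc.update(frame) + enc.finalize())
        # Cascade: the next frame's nonce depends on this ciphertext,
        # so a tampered frame breaks decryption of all later frames.
        chain = hashlib.sha256(chain + out[-1]).digest()[:16]
    return out

# Toy usage with placeholder header and signature bytes
frames = [b"frame-0-pixels", b"frame-1-pixels"]
cipher_frames = encrypt_frames(frames, header=b"patient|study|series",
                               signature=b"dsig-bytes")
print([ct.hex() for ct in cipher_frames])
```

Decryption mirrors this loop: it rederives the key and chaining values from the (correct) header, signature, and ciphertexts, which is why retrieval succeeds if and only if all inputs are intact.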