990 results for Lattice-based cryptosystems


Relevance:

80.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on the computation, transmission, and storage costs. This decomposition structure is based on an analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criterion as well as the sensitivities in human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training and multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage.

Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while it properly maintains a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only the cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training and multiquantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.
For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
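As a concrete illustration of the lattice-quantization building block mentioned above, the sketch below implements the classic Conway-Sloane nearest-point rule for the D_n lattice (integer vectors with even coordinate sum) in Python. The function names, the scaling factor value, and the omission of the truncation level are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

def closest_point_Dn(x):
    """Nearest-point search in the D_n lattice (integer vectors with an even
    coordinate sum), following the Conway-Sloane rounding procedure."""
    f = np.rint(x)                       # round every coordinate to the nearest integer
    if int(f.sum()) % 2 == 0:
        return f
    # Otherwise re-round the coordinate with the largest rounding error
    # in the "wrong" direction to restore an even coordinate sum.
    g = f.copy()
    k = np.argmax(np.abs(x - f))
    g[k] += 1.0 if x[k] > f[k] else -1.0
    return g

def lattice_quantize(vec, scale):
    """Quantize a block of wavelet coefficients with a scaled D_n lattice.
    `scale` plays the role of the scaling factor named in the abstract;
    truncation (limiting the number of shells) is omitted in this toy sketch."""
    return scale * closest_point_Dn(np.asarray(vec, dtype=float) / scale)

# Example: quantize a 4-dimensional coefficient block with the D4 lattice.
print(lattice_quantize([0.7, -1.2, 2.9, 0.1], scale=0.5))
```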

Relevance:

80.00%

Publisher:

Abstract:

Keyword Spotting is the task of detecting keywords of interest within continuous speech. The applications of this technology range from call centre dialogue systems to covert speech surveillance devices. Keyword spotting is particularly well suited to data mining tasks such as real-time keyword monitoring and unrestricted vocabulary audio document indexing. However, to date, many keyword spotting approaches have suffered from poor detection rates, high false alarm rates, or slow execution times, thus reducing their commercial viability. This work investigates the application of keyword spotting to data mining tasks. The thesis makes a number of major contributions to the field of keyword spotting. The first major contribution is the development of a novel keyword verification method named Cohort Word Verification. This method combines high level linguistic information with cohort-based verification techniques to obtain dramatic improvements in verification performance, in particular for the problematic short duration target word class. The second major contribution is the development of a novel audio document indexing technique named Dynamic Match Lattice Spotting. This technique augments lattice-based audio indexing principles with dynamic sequence matching techniques to provide robustness to erroneous lattice realisations. The resulting algorithm obtains significant improvement in detection rate over lattice-based audio document indexing while still maintaining extremely fast search speeds. The third major contribution is the study of multiple verifier fusion for the task of keyword verification. The reported experiments demonstrate that substantial improvements in verification performance can be obtained through the fusion of multiple keyword verifiers. The research focuses on combinations of speech background model based verifiers and cohort word verifiers. The final major contribution is a comprehensive study of the effects of limited training data for keyword spotting. This study is performed with consideration as to how these effects impact the immediate development and deployment of speech technologies for non-English languages.
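To illustrate the dynamic-sequence-matching idea that underpins Dynamic Match Lattice Spotting, the Python sketch below scores candidate phone sequences read off lattice paths against a keyword's phone sequence using a standard edit-distance recursion. The function names, edit costs, threshold, and toy phone strings are illustrative assumptions rather than the thesis's actual algorithm.

```python
def match_cost(target, hypothesis, sub=1, ins=1, dele=1):
    """Minimum-edit-distance cost between a keyword phone sequence and a
    phone sequence taken from one lattice path (classic dynamic programming)."""
    n, m = len(target), len(hypothesis)
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * dele
    for j in range(1, m + 1):
        d[0][j] = j * ins
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if target[i - 1] == hypothesis[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + dele,      # phone deleted in hypothesis
                          d[i][j - 1] + ins,       # phone inserted in hypothesis
                          d[i - 1][j - 1] + cost)  # match or substitution
    return d[n][m]

def spot_keyword(target_phones, lattice_paths, threshold=2):
    """Flag lattice paths whose phone sequence lies within `threshold` edits of
    the keyword, giving robustness to erroneous lattice realisations."""
    return [p for p in lattice_paths if match_cost(target_phones, p) <= threshold]

# Toy example with made-up phone sequences.
paths = [["k", "ae", "t"], ["k", "ah", "t"], ["d", "ao", "g"]]
print(spot_keyword(["k", "ae", "t"], paths, threshold=1))
```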

Relevance:

80.00%

Publisher:

Abstract:

We study MCF-7 breast cancer cell movement in a transwell apparatus. Various experimental conditions lead to a variety of monotone and nonmonotone responses which are difficult to interpret. We anticipate that the experimental results could be caused by cell-to-cell adhesion or volume exclusion. Without any modeling, it is impossible to understand the relative roles played by these two mechanisms. A lattice-based exclusion process random-walk model incorporating agent-to-agent adhesion is applied to the experimental system. Our combined experimental and modeling approach shows that a low value of cell-to-cell adhesion strength provides the best explanation of the experimental data suggesting that volume exclusion plays a more important role than cell-to-cell adhesion. This combined experimental and modeling study gives insight into the cell-level details and design of transwell assays.
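A minimal Python sketch of a lattice-based exclusion-process random walk with agent-to-agent adhesion is given below. The adhesion rule used here, in which an agent with k occupied nearest neighbours attempts a move with probability (1 - q)^k, together with the grid size, seeding density, and the value of the adhesion strength q, are illustrative assumptions and not the paper's exact model specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(lattice, q):
    """One time step of a square-lattice exclusion process with adhesion.
    q in [0, 1] is the adhesion strength: an agent with k occupied nearest
    neighbours attempts to move with probability (1 - q)**k, and the move is
    aborted if the randomly chosen target site is already occupied."""
    rows, cols = lattice.shape
    occupied = np.argwhere(lattice == 1)
    for i, j in occupied[rng.permutation(len(occupied))]:
        nbrs = [((i - 1) % rows, j), ((i + 1) % rows, j),
                (i, (j - 1) % cols), (i, (j + 1) % cols)]
        k = sum(lattice[a, b] for a, b in nbrs)
        if rng.random() > (1.0 - q) ** k:
            continue                     # held in place by adhesion
        ti, tj = nbrs[rng.integers(4)]
        if lattice[ti, tj] == 0:         # volume exclusion
            lattice[i, j], lattice[ti, tj] = 0, 1
    return lattice

# Toy run: 50x50 periodic lattice seeded at 20% density, weak adhesion.
L = (rng.random((50, 50)) < 0.2).astype(int)
for _ in range(100):
    step(L, q=0.1)
print(L.sum(), "agents (number conserved)")
```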

Relevance:

80.00%

Publisher:

Abstract:

Continuum diffusion models are often used to represent the collective motion of cell populations. Most previous studies have simply used linear diffusion to represent collective cell spreading, while others found that degenerate nonlinear diffusion provides a better match to experimental cell density profiles. In the cell modeling literature there is no guidance available with regard to which approach is more appropriate for representing the spreading of cell populations. Furthermore, there is no knowledge of particular experimental measurements that can be made to distinguish between situations where these two models are appropriate. Here we provide a link between individual-based and continuum models using a multi-scale approach in which we analyze the collective motion of a population of interacting agents in a generalized lattice-based exclusion process. For round agents that occupy a single lattice site, we find that the relevant continuum description of the system is a linear diffusion equation, whereas for elongated rod-shaped agents that occupy L adjacent lattice sites we find that the relevant continuum description is connected to the porous media equation (PME). The exponent in the nonlinear diffusivity function is related to the aspect ratio of the agents. Our work provides a physical connection between modeling collective cell spreading and the use of either the linear diffusion equation or the PME to represent cell density profiles. Results suggest that when using continuum models to represent cell population spreading, we should take care to account for variations in the cell aspect ratio because different aspect ratios lead to different continuum models.
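For reference, the two continuum limits discussed above can be written as follows. This is a minimal statement only: the exponent n > 0 is schematic, standing in for the abstract's claim that it is set by the agent aspect ratio, and K (the maximum packing density) is an assumed symbol not named in the abstract.

```latex
\frac{\partial C}{\partial t} = D_{0}\,\nabla^{2} C
\quad\text{(round agents, one lattice site each)},
\qquad
\frac{\partial C}{\partial t} = \nabla\cdot\!\left[D_{0}\left(\frac{C}{K}\right)^{\!n}\nabla C\right]
\quad\text{(rod-shaped agents, porous-media type)}.
```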

Relevance:

80.00%

Publisher:

Abstract:

The contributions of this thesis fall into three areas of certificateless cryptography. The first area is encryption, where we propose new constructions for both identity-based and certificateless cryptography. We construct an n-out-of-n group encryption scheme for identity-based cryptography that does not require any special means to generate the keys of the trusted authorities that are participating. We also introduce a new security definition for chosen ciphertext secure multi-key encryption. We prove that our construction is secure as long as at least one authority is uncompromised, and show that the existing constructions for chosen ciphertext security from identity-based encryption also hold in the group encryption case. We then consider certificateless encryption as the special case of 2-out-of-2 group encryption and give constructions for highly efficient certificateless schemes in the standard model. Among these is the first construction of a lattice-based certificateless encryption scheme. Our next contribution is a highly efficient certificateless key encapsulation mechanism (KEM) that we prove secure in the standard model. We introduce a new way of proving the security of certificateless schemes that are based on identity-based schemes. We leave the identity-based part of the proof intact, and just extend it to cover the part that is introduced by the certificateless scheme. We show that our construction is more efficient than any instantiation of generic constructions for certificateless key encapsulation in the standard model. The third area where the thesis contributes to the advancement of certificateless cryptography is key agreement. Swanson showed that many certificateless key agreement schemes are insecure if considered in a reasonable security model. We propose the first provably secure certificateless key agreement schemes in the strongest model for certificateless key agreement. We extend Swanson's definition for certificateless key agreement and give more power to the adversary. Our new schemes are secure as long as each party has at least one uncompromised secret. Our first construction is in the random oracle model and gives the adversary slightly more capabilities than our second construction in the standard model. Interestingly, our standard model construction is as efficient as the random oracle model construction.

Relevance:

80.00%

Publisher:

Abstract:

Moving fronts of cells are essential features of embryonic development, wound repair and cancer metastasis. This paper describes a set of experiments to investigate the roles of random motility and proliferation in driving the spread of an initially confined cell population. The experiments include an analysis of cell spreading when proliferation was inhibited. Our data have been analysed using two mathematical models: a lattice-based discrete model and a related continuum partial differential equation model. We obtain independent estimates of the random motility parameter, D, and the intrinsic proliferation rate, λ, and we confirm that these estimates lead to accurate modelling predictions of the position of the leading edge of the moving front as well as the evolution of the cell density profiles. Previous work suggests that systems with a high λ/D ratio will be characterized by steep fronts, whereas systems with a low λ/D ratio will lead to shallow diffuse fronts and this is confirmed in the present study. Our results provide evidence that continuum models, based on the Fisher–Kolmogorov equation, are a reliable platform upon which we can interpret and predict such experimental observations.
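For reference, the Fisher-Kolmogorov model referred to above combines linear diffusion with logistic proliferation. A standard statement is given below; the carrying capacity K and the minimum travelling-front speed are standard textbook facts, not quantities reported in the abstract.

```latex
\frac{\partial C}{\partial t} = D\,\nabla^{2} C + \lambda\, C\!\left(1 - \frac{C}{K}\right),
\qquad
c_{\min} = 2\sqrt{\lambda D}.
```

At the leading edge the minimum-speed front decays at a rate proportional to \(\sqrt{\lambda/D}\), so a larger \(\lambda/D\) ratio produces a steeper front, consistent with the observation in the abstract.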

Relevance:

80.00%

Publisher:

Abstract:

Basing signature schemes on strong lattice problems has been a long-standing open issue. Today, two families of lattice-based signature schemes are known: the ones based on the hash-and-sign construction of Gentry et al.; and Lyubashevsky’s schemes, which are based on the Fiat-Shamir framework. In this paper we show for the first time how to adapt the schemes of Lyubashevsky to the ring signature setting. In particular we transform the scheme of ASIACRYPT 2009 into a ring signature scheme that provides strong properties of security under the random oracle model. Anonymity is ensured in the sense that signatures of different users are within negligible statistical distance even under full key exposure. In fact, the scheme satisfies a notion which is stronger than the classical full key exposure setting: even if the keypair of the signing user is adversarially chosen, the statistical distance between signatures of different users remains negligible. Considering unforgeability, the best lattice-based ring signature schemes provide either unforgeability against arbitrary chosen subring attacks or insider corruption in log-sized rings. In this paper we present two variants of our scheme. In the basic one, unforgeability is ensured in those two settings. Increasing signature and key sizes by a factor k (typically 80–100), we provide a variant in which unforgeability is ensured against insider corruption attacks for arbitrary rings. The technique used is pretty general and can be adapted to other existing schemes.

Relevance:

80.00%

Publisher:

Abstract:

Distributed-password public-key cryptography (DPwPKC) allows the members of a group of people, each one holding only a small secret password, to help a leader perform the private operation associated with a public-key cryptosystem. Abdalla et al. recently defined this tool [1], with a practical construction. Unfortunately, the latter applied only to ElGamal decryption and relied on the DDH assumption, excluding any recent pairing-based cryptosystems. In this paper, we extend their techniques to support, and exploit, pairing-based properties: we take advantage of pairing-friendly groups to obtain efficient (simulation-sound) zero-knowledge proofs, whose security relies on the Decisional Linear assumption. As a consequence, we provide efficient protocols, secure in the standard model, for ElGamal decryption as in [1], but also for Linear decryption, as well as extraction of several identity-based cryptosystems [6,4]. Furthermore, we strengthen their security model by suppressing the useless testPwd queries in the functionality.

Relevance:

80.00%

Publisher:

Abstract:

We show that the LASH-x hash function is vulnerable to attacks that trade time for memory, including collision attacks as fast as 2^(4x/11) and preimage attacks as fast as 2^(4x/7). Moreover, we briefly mention heuristic lattice-based collision attacks that use small memory but require very long messages and that are expected to find collisions much faster than 2^(x/2). All of these attacks exploit the designers’ choice of an all-zero IV. We then consider whether LASH can be patched simply by changing the IV. In this case, we show that LASH is vulnerable to a 2^(7x/8) preimage attack. We also show that LASH is trivially not a PRF when any subset of input bytes is used as a secret key. None of our attacks depend upon the particular contents of the LASH matrix – we only assume that the distribution of elements is more or less uniform.

Relevance:

80.00%

Publisher:

Abstract:

NTRUEncrypt is a fast and practical lattice-based public-key encryption scheme, which has been standardized by IEEE, but until recently, its security analysis relied only on heuristic arguments. Recently, Stehlé and Steinfeld showed that a slight variant (that we call pNE) could be proven to be secure under chosen-plaintext attack (IND-CPA), assuming the hardness of worst-case problems in ideal lattices. We present a variant of pNE called NTRUCCA, that is IND-CCA2 secure in the standard model assuming the hardness of worst-case problems in ideal lattices, and only incurs a constant factor overhead in ciphertext and key length over the pNE scheme. To our knowledge, our result gives the first IND-CCA2 secure variant of NTRUEncrypt in the standard model, based on standard cryptographic assumptions. As an intermediate step, we present a construction for an All-But-One (ABO) lossy trapdoor function from pNE, which may be of independent interest. Our scheme uses the lossy trapdoor function framework of Peikert and Waters, which we generalize to the case of (k − 1)-of-k-correlated input distributions.

Relevance:

80.00%

Publisher:

Abstract:

Initial attempts to obtain lattice-based signatures were closely related to reducing a vector modulo the fundamental parallelepiped of a secret basis (like GGH [9] or NTRUSign [12]). This approach leaked some information on the secret, namely the shape of the parallelepiped, which has been exploited in practical attacks [24]. NTRUSign was an extremely efficient scheme, and thus there has been noticeable interest in developing countermeasures to the attacks, but with little success [6]. In [8] Gentry, Peikert and Vaikuntanathan proposed a randomized version of Babai’s nearest plane algorithm such that the distribution of a reduced vector modulo a secret parallelepiped depends only on the size of the basis used. Using this algorithm and generating large, close to uniform, public keys, they managed to obtain provably secure GGH-like lattice-based signatures. Recently, Stehlé and Steinfeld obtained a provably secure scheme very close to NTRUSign [26] (from a theoretical point of view). In this paper we present an alternative approach to seal the leak of NTRUSign. Instead of modifying the lattices and algorithms used, we compute a classic leaky NTRUSign signature and hide it with Gaussian noise using techniques present in Lyubashevsky’s signatures. Our main contributions are thus a set of strong NTRUSign parameters, obtained by taking into account the latest known attacks against the scheme, and a statistical way to hide the leaky NTRU signature so that this particular instantiation of a CVP-based signature scheme becomes zero-knowledge and secure against forgeries, based on the worst-case hardness of the Õ(N^1.5)-Shortest Independent Vector Problem over NTRU lattices. Finally, we give a set of concrete parameters to gauge the efficiency of the obtained signature scheme.
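The statistical hiding step can be illustrated with a one-dimensional toy version of the rejection-sampling technique used in Lyubashevsky-style signatures: a secret-dependent value is masked with Gaussian noise and resampled until its distribution no longer depends on the secret. Everything below (the single dimension, the values of sigma and M, and the function names) is an illustrative assumption; the actual scheme works with vectors over NTRU lattices and carefully chosen parameters.

```python
import math
import random

def gauss_pdf(x, center, sigma):
    # Unnormalized Gaussian density; the normalization cancels in the ratio below.
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def hide_with_noise(leaky_value, sigma, M):
    """Rejection-sampling loop: returns z = y + leaky_value whose distribution is
    statistically close to a centred Gaussian, so publishing z no longer leaks
    the secret-dependent shift.  Toy, one-dimensional illustration only."""
    while True:
        y = random.gauss(0.0, sigma)
        z = y + leaky_value
        # Accept z with probability D_sigma(z) / (M * D_{sigma, leaky_value}(z)),
        # i.e. resample until the output looks like a centred Gaussian.
        accept = min(1.0, gauss_pdf(z, 0.0, sigma) / (M * gauss_pdf(z, leaky_value, sigma)))
        if random.random() < accept:
            return z

# With sigma much larger than the hidden shift, acceptance is frequent and the
# output distributions for different shifts are statistically close.
samples = [hide_with_noise(leaky_value=3.0, sigma=100.0, M=1.7) for _ in range(5)]
print(samples)
```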

Relevance:

80.00%

Publisher:

Abstract:

Lattice-based cryptographic primitives are believed to offer resilience against attacks by quantum computers. We demonstrate the practicality of post-quantum key exchange by constructing cipher suites for the Transport Layer Security (TLS) protocol that provide key exchange based on the ring learning with errors (R-LWE) problem, and we accompany these cipher suites with a rigorous proof of security. Our approach ties lattice-based key exchange together with traditional authentication using RSA or elliptic curve digital signatures: the post-quantum key exchange provides forward secrecy against future quantum attackers, while authentication can be provided using RSA keys that are issued by today's commercial certificate authorities, smoothing the path to adoption. Our cryptographically secure implementation, aimed at the 128-bit security level, reveals that the performance price when switching from non-quantum-safe key exchange is not too high. With our R-LWE cipher suites integrated into the OpenSSL library and using the Apache web server on a 2-core desktop computer, we could serve 506 RLWE-ECDSA-AES128-GCM-SHA256 HTTPS connections per second for a 10 KiB payload. Compared to elliptic curve Diffie-Hellman, this amounts to an 8 KiB larger handshake and a reduction in throughput of only 21%. This demonstrates that provably secure post-quantum key exchange can already be considered practical.
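The overall shape of an unauthenticated ring-LWE key agreement can be sketched as below. The tiny parameters (n = 16, q = 12289), the ternary error distribution, and the omission of the reconciliation step that turns the two noisy shared values into identical key bits are deliberate simplifications for illustration; they are not the paper's implementation or parameter set.

```python
import numpy as np

# Toy ring-LWE-style key agreement in R_q = Z_q[x]/(x^n + 1).
n, q = 16, 12289                        # illustrative only, far too small for security
rng = np.random.default_rng(1)

def ring_mul(a, b):
    """Multiply two polynomials modulo x^n + 1 and modulo q."""
    full = np.convolve(a, b)
    res = full[:n].copy()
    res[:len(full) - n] -= full[n:]     # reduce x^n to -1
    return res % q

def small():
    """Sample a 'small' secret/error polynomial with coefficients in {-1, 0, 1}."""
    return rng.integers(-1, 2, size=n)

a = rng.integers(0, q, size=n)          # public ring element shared by both parties

s_A, e_A = small(), small()             # Alice's secret and error
b_A = (ring_mul(a, s_A) + e_A) % q      # Alice sends b_A to Bob

s_B, e_B = small(), small()             # Bob's secret and error
b_B = (ring_mul(a, s_B) + e_B) % q      # Bob sends b_B to Alice

k_A = ring_mul(b_B, s_A)                # Alice computes a*s_A*s_B + e_B*s_A
k_B = ring_mul(b_A, s_B)                # Bob computes   a*s_A*s_B + e_A*s_B
diff = (k_A - k_B) % q
# The two views agree up to a small error; reconciliation (omitted here) would
# turn them into identical shared key bits.
print("largest coefficient-wise difference:", np.minimum(diff, q - diff).max())
```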

Relevance:

80.00%

Publisher:

Abstract:

Enthused by the fascinating properties of graphene, we have prepared graphene analogues of BN by a chemical method with control over the number of layers. The method involves the reaction of boric acid with urea, wherein the relative proportions of the two have been varied over a wide range. Synthesis with a high proportion of urea yields a product with a majority of 1-4 layers. The surface area of BN increases progressively with the decreasing number of layers, and the high-surface-area BN exhibits high CO2 adsorption but negligible H2 adsorption. Few-layer BN has been solubilized by interaction with Lewis bases. We have used first-principles simulations to determine the structure, phonon dispersion, and elastic properties of BN in planar honeycomb-lattice-based n-layer forms. We find that the mechanical stability of BN with respect to out-of-plane deformation is quite different from that of graphene, as evident in the dispersion of their flexural modes. BN is softer than graphene and exhibits signatures of long-range ionic interactions in its optical phonons. Finally, structures with different stacking sequences of BN have comparable energies, suggesting a relative abundance of slip faults, stacking faults, and structural inhomogeneities in multilayer BN.

Relevance:

80.00%

Publisher:

Abstract:

Since its introduction, the selective-identity (sID) model for identity-based cryptosystems and its relationship with various other notions of security have been extensively studied. As a result, there is a general consensus that the sID model is much weaker than the full-identity (ID) model. In this paper, we study the sID model for the particular case of identity-based signatures (IBS). The main focus is on the problem of constructing an ID-secure IBS given an sID-secure IBS without using random oracles (the so-called standard model) and with reasonable security degradation. We accomplish this by devising a generic construction which uses as black boxes: i) a chameleon hash function and ii) a weakly secure public-key signature scheme. We argue that the resulting IBS is ID-secure but with a tightness gap of O(q_s), where q_s is the upper bound on the number of signature queries that the adversary is allowed to make. To the best of our knowledge, this is the first attempt at such a generic construction.

Relevance:

80.00%

Publisher:

Abstract:

This paper discusses the Cambridge University HTK (CU-HTK) system for the automatic transcription of conversational telephone speech. A detailed discussion of the most important techniques in front-end processing, acoustic modeling and model training, and language and pronunciation modeling is presented. These include the use of conversation-side-based cepstral normalization, vocal tract length normalization, heteroscedastic linear discriminant analysis for feature projection, minimum phone error training and speaker adaptive training, lattice-based model adaptation, confusion-network-based decoding and confidence score estimation, pronunciation selection, language model interpolation, and class-based language models. The transcription system developed for participation in the 2002 NIST Rich Transcription evaluations of English conversational telephone speech data is presented in detail. In this evaluation the CU-HTK system gave an overall word error rate of 23.9%, which was the best performance by a statistically significant margin. Further details on the derivation of faster systems with moderate performance degradation are discussed in the context of the 2002 CU-HTK 10×RT conversational speech transcription system. © 2005 IEEE.