937 results for Block Cipher
Abstract:
This work examines the algebraic cryptanalysis of small-scale variants of LEX-BES. LEX-BES is a stream cipher based on the Advanced Encryption Standard (AES) block cipher. LEX is a generic method for constructing a stream cipher from a block cipher, introduced by Biryukov in 2005 at eSTREAM, the ECRYPT Stream Cipher project. The Big Encryption System (BES) is a block cipher introduced at CRYPTO 2002 that facilitates the algebraic analysis of the AES block cipher. In this article, experiments were conducted to find solutions of equation systems describing small-scale LEX-BES using Gröbner basis computations. This follows an approach similar to the work of Cid, Murphy and Robshaw at FSE 2005, which investigated algebraic cryptanalysis of small-scale variants of the BES. The difference between LEX-BES and BES is that, owing to the way the keystream is extracted, the number of unknowns in the LEX-BES equations is smaller than in those of BES. To the best of the authors' knowledge, this is the first attempt at constructing solvable equation systems for stream ciphers based on the LEX method using Gröbner basis computations.
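At toy scale, the idea of solving a cipher's equation system can be illustrated without a Gröbner basis engine: for a handful of GF(2) unknowns, exhaustive search over all assignments recovers the same solutions a Gröbner basis computation would. The system below is an invented three-variable toy, not the actual LEX-BES equations, which involve many more unknowns and require dedicated Gröbner basis software.

```python
from itertools import product

# Invented toy polynomial system over GF(2) in unknowns k0, k1, k2
# (NOT the real LEX-BES equations): each polynomial must vanish.
equations = [
    lambda k0, k1, k2: (k0 * k1) ^ k2 ^ 1,   # k0*k1 + k2 + 1 = 0
    lambda k0, k1, k2: k0 ^ k1,              # k0 + k1     = 0
    lambda k0, k1, k2: (k1 * k2) ^ k0,       # k1*k2 + k0  = 0
]

def solve_gf2(eqs, nvars):
    # Brute force over all 2^nvars assignments; at real scale a Groebner
    # basis computation replaces this step, but agrees on toy instances.
    return [v for v in product((0, 1), repeat=nvars)
            if all(eq(*v) == 0 for eq in eqs)]

solutions = solve_gf2(equations, 3)
```

For this toy system the unique solution is the assignment k0 = 0, k1 = 0, k2 = 1.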
Abstract:
We examine the security of the 64-bit lightweight block cipher PRESENT-80 against related-key differential attacks. With a computer search we are able to prove that for any related-key differential characteristic on full-round PRESENT-80, the probability of the characteristic in the 64-bit state alone is not higher than 2^-64. To overcome the computational complexity of the search, which is exponential in the state and key sizes, we use truncated differences; however, as the key schedule is not nibble-oriented, we switch to actual differences and apply early-abort techniques to prune the tree-based search. With a new method, called the extended split approach, we make the whole search feasible, and we implement and run it in real time. Our approach targets the PRESENT-80 cipher; however, with small modifications it can be reused for other lightweight ciphers as well.
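The early-abort idea in such tree searches can be sketched abstractly: accumulate the characteristic's probability (in bits) along a branch and prune as soon as it already exceeds the target bound. The round model below is an invented toy (each active nibble pays the best S-box transition probability 2^-2, and diffusion either preserves or doubles activity), not PRESENT-80's actual difference propagation.

```python
import math

# Invented toy round model (NOT PRESENT-80's real propagation rules):
# each currently active nibble costs 2 bits of probability, and the toy
# diffusion layer either keeps or doubles the active-nibble count.
BEST_SBOX_BITS = 2  # -log2 of the best toy S-box transition probability

def search(rnd, active, bits, rounds, bound_bits, best):
    if bits > bound_bits:          # early abort: this branch can no
        return best                # longer beat the 2^-bound threshold
    if rnd == rounds:
        return min(best, bits)
    cost = active * BEST_SBOX_BITS
    for nxt in (active, min(2 * active, 16)):   # toy branching choices
        best = search(rnd + 1, nxt, bits + cost, rounds, bound_bits, best)
    return best

# Best 5-round characteristic in the toy model: one active nibble per
# round, i.e. 5 * 2 = 10 bits.
best_bits = search(0, 1, 0, rounds=5, bound_bits=64, best=math.inf)
```

A real search over the full state and key difference space relies on exactly this pruning to stay feasible.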
Abstract:
In this paper we present a truncated differential analysis of reduced-round LBlock by computing the differential distribution of every nibble of the state. The LLR (log-likelihood ratio) statistical test is used as a tool to apply the distinguishing and key-recovery attacks. To build the distinguisher, all possible differences are traced through the cipher and the truncated differential probability distribution is determined for every output nibble. We concatenate additional rounds to the beginning and end of the truncated differential distinguisher to apply the key-recovery attack. By exploiting properties of the key schedule, we obtain a large overlap of key bits used in the beginning and final rounds. This allows us to significantly increase the differential probabilities and hence reduce the attack complexity. We validate the analysis by implementing the attack on LBlock reduced to 12 rounds. Finally, we apply single-key and related-key attacks on 18- and 21-round LBlock, respectively.
Abstract:
In this article, we study the security of the IDEA block cipher when it is used in various single-length or double-length hashing modes. Even though this cipher is still considered secure, we show that one should avoid its use as an internal primitive for block cipher based hashing. In particular, we are able to generate free-start collisions instantaneously for most modes, and even semi-free-start collisions, pseudo-preimages or hash collisions in practical complexity. This work gives a practical example of the gap that exists between secret-key and known- or chosen-key security for block ciphers. Moreover, we also settle the 20-year-old open question concerning the security of the Abreast-DM and Tandem-DM double-length compression functions, originally invented to be instantiated with IDEA. Our attacks have been verified experimentally and work even for strengthened versions of IDEA with any number of rounds.
Abstract:
Preneel, Govaerts and Vandewalle (PGV) analysed the security of single-block-length block cipher based compression functions assuming that the underlying block cipher has no weaknesses. They showed that 12 out of 64 possible compression functions are collision and (second) preimage resistant. Black, Rogaway and Shrimpton formally proved this result in the ideal cipher model. However, in the indifferentiability security framework introduced by Maurer, Renner and Holenstein, all these 12 schemes are easily differentiable from a fixed input-length random oracle (FIL-RO) even when their underlying block cipher is ideal. We address the problem of building indifferentiable compression functions from the PGV compression functions. We consider a general form of 64 PGV compression functions and replace the linear feed-forward operation in this generic PGV compression function with an ideal block cipher independent of the one used in the generic PGV construction. This modified construction is called a generic modified PGV (MPGV). We analyse indifferentiability of the generic MPGV construction in the ideal cipher model and show that 12 out of 64 MPGV compression functions in this framework are indifferentiable from a FIL-RO. To our knowledge, this is the first result showing that two independent block ciphers are sufficient to design indifferentiable single-block-length compression functions.
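The single-block-length PGV schemes combine a block cipher E, a chaining value h and a message block m in 64 ways; the secure dozen includes shapes such as Davies-Meyer, h' = E_m(h) XOR h, and Matyas-Meyer-Oseas, h' = E_h(m) XOR m, where the XOR is the feed-forward operation mentioned above. The sketch below shows these two shapes over a placeholder 16-byte "block cipher" built from SHA-256; this stand-in is an assumption for illustration only (it is not invertible, so it shows the data flow, not a real instantiation).

```python
import hashlib

BLOCK = 16

def E(key: bytes, x: bytes) -> bytes:
    # Placeholder for a real block cipher: a SHA-256-based keyed function
    # used only to make the PGV data flow runnable (not invertible, so
    # not an actual cipher).
    return hashlib.sha256(key + x).digest()[:BLOCK]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(p ^ q for p, q in zip(a, b))

def davies_meyer(h, m):
    return xor(E(m, h), h)          # h' = E_m(h) XOR h

def matyas_meyer_oseas(h, m):
    return xor(E(h, m), m)          # h' = E_h(m) XOR m

# Iterate one of the compression functions over two 16-byte blocks.
h0 = bytes(BLOCK)
msg = [b'block one.......', b'block two.......']
h = h0
for m in msg:
    h = davies_meyer(h, m)
```

The indifferentiability result described above concerns replacing the XOR feed-forward in this generic shape with a second, independent ideal cipher.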
Abstract:
Efficient error-Propagating Block Chaining (EPBC) is a block cipher mode intended to simultaneously provide both confidentiality and integrity protection for messages. Mitchell’s analysis pointed out a weakness in the EPBC integrity mechanism that can be used in a forgery attack. This paper identifies and corrects a flaw in Mitchell’s analysis of EPBC, and presents other attacks on the EPBC integrity mechanism.
Abstract:
Grøstl is a SHA-3 candidate proposal. Grøstl is an iterated hash function with a compression function built from two fixed, large, distinct permutations. The design of Grøstl is transparent and based on principles very different from those used in the SHA-family. The two permutations are constructed using the wide trail design strategy, which makes it possible to give strong statements about the resistance of Grøstl against large classes of cryptanalytic attacks. Moreover, if these permutations are assumed to be ideal, there is a proof for the security of the hash function. Grøstl is a byte-oriented SP-network which borrows components from the AES. The S-box used is identical to the one used in the block cipher AES and the diffusion layers are constructed in a similar manner to those of the AES. As a consequence there is a very strong confusion and diffusion in Grøstl. Grøstl is a so-called wide-pipe construction where the size of the internal state is significantly larger than the size of the output. This has the effect that all known, generic attacks on the hash function are made much more difficult. Grøstl has good performance on a wide range of platforms and counter-measures against side-channel attacks are well-understood from similar work on the AES.
Abstract:
So far, low-probability differentials for the key schedule of block ciphers have been used as a straightforward proof of security against related-key differential analysis. To achieve resistance, it is believed that, for a cipher with a k-bit key, it suffices for the upper bound on the probability to be 2^-k. Surprisingly, we show that this reasonable assumption is incorrect, and that the probability should be (much) lower than 2^-k. Our counterexample is a related-key differential analysis of the well-established block cipher CLEFIA-128. We show that although the key schedule of CLEFIA-128 prevents differentials with a probability higher than 2^-128, the linear part of the key schedule that produces the round keys, together with the Feistel structure of the cipher, allows one to exploit particularly chosen differentials with a probability as low as 2^-128. CLEFIA-128 has 2^14 such differentials, which translate into 2^14 pairs of weak keys. The probability of each differential is too low, but the weak keys have a special structure which allows a divide-and-conquer approach to gain an advantage of 2^7 over generic analysis. We exploit this advantage to give a membership test for the weak-key class and provide an analysis of the hashing modes. The proposed analysis has been tested with computer experiments on small-scale variants of CLEFIA-128. Our results do not threaten the practical use of CLEFIA.
Abstract:
In this paper, we analyse a block cipher mode of operation submitted in 2014 to the cryptographic competition for authenticated encryption (CAESAR). This mode was designed by Recacha and is called ++AE (plus-plus-AE). We propose a chosen-plaintext forgery attack on ++AE that requires only a single chosen message query to allow an attacker to construct multiple forged messages. Our attack is deterministic and guaranteed to pass the ++AE integrity check. We demonstrate the forgery attack using 128-bit AES as the underlying block cipher. Hence, ++AE is insecure as an authenticated encryption mode of operation.
Abstract:
Block ciphers are widely used in information security. The standardization of cryptographic algorithms has set off a new wave of algorithm design and analysis, and has made the testing and evaluation of cryptographic algorithms a new research focus. Testing and evaluating block ciphers scientifically and soundly is a complex systems-engineering task involving many theoretical and technical problems. This thesis studies several key problems in this area, namely the correlations among randomness test items and the choice of their parameters, new testing methods designed specifically for block ciphers, fast testing techniques, and a reasonable, practical quantitative evaluation model, and obtains a series of results. These results will advance the automation of cryptographic algorithm testing and analysis and the practical use of cipher evaluation, and will provide theoretical and technical support for the standardization of cryptographic algorithms and of cryptographic testing. Specifically, the contributions of this thesis can be summarized in the following five aspects: 1) Correlations among randomness test items: we derive the equivalence between the binary derivative test and the autocorrelation test when the parameter is 2^k, introduce the notion of the correlation degree between test items, and quantify it with an entropy-based method. These results provide a theoretical basis for selecting randomness test items. 2) Parameter selection for randomness tests: we first derive the conditions that independent parameters must satisfy and, on that basis, design a hypothesis-testing algorithm for detecting dependence between two parameters. This provides an operational means for parameter selection in randomness testing. 3) Exploiting the characteristics of block ciphers, we design a new statistical testing method that takes the block length as its basic unit. It makes full use of the generality and high degree of automation of statistical testing models and can, to some extent, reflect a block cipher's resistance to integral attacks. 4) On the fast implementation of tests for block cipher components, we propose a threshold-filtering method for computing the transparency order of S-boxes. By introducing a threshold, it discards a large number of useless computations and thereby greatly speeds up the computation of the transparency order, providing technical support for the practical evaluation of large-scale S-boxes. 5) On the quantitative evaluation of block ciphers, we give a comprehensive evaluation model based on fuzzy multi-criteria decision making, in which membership functions fuzzify the test results. The model reflects the continuous, gradual nature of the indicators, effectively solves the loss of evaluation information caused by pure threshold methods, and lays a foundation for quantitative evaluation. It offers a new approach to the comprehensive evaluation of block ciphers.
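The equivalence claimed in contribution 1) can be checked directly: over GF(2) the binary derivative is multiplication by (1 + x), and (1 + x)^(2^k) = 1 + x^(2^k) mod 2, so applying the derivative 2^k times to a sequence equals XORing the sequence with its shift by 2^k, which is exactly the statistic the autocorrelation test examines. A quick numerical check:

```python
import random

def derivative(bits):
    # binary derivative: s_i XOR s_{i+1}
    return [a ^ b for a, b in zip(bits, bits[1:])]

def autocorr_xor(bits, d):
    # sequence XORed with its shift by d (autocorrelation statistic input)
    return [a ^ b for a, b in zip(bits, bits[d:])]

random.seed(1)
s = [random.randint(0, 1) for _ in range(64)]

d = s
for _ in range(8):       # apply the derivative 2^3 = 8 times
    d = derivative(d)
# d now equals s XOR (s shifted by 8), as the freshman's-dream
# identity (1+x)^(2^k) = 1 + x^(2^k) over GF(2) predicts.
```

The identity holds only for power-of-two shift counts, which is why the equivalence is stated for parameter 2^k.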
Abstract:
This paper studies the security of the block ciphers ARIA and Camellia against impossible differential cryptanalysis. Our work improves on the best impossible differential cryptanalysis of ARIA and Camellia known so far. The designers of ARIA expected that no impossible differentials exist for 4-round ARIA. However, we found some nontrivial 4-round impossible differentials, which may lead to a possible attack on 6-round ARIA. Moreover, we found some nontrivial 8-round impossible differentials for Camellia, whereas only 7-round impossible differentials were previously known. Using the 8-round impossible differentials, we present an attack on 12-round Camellia without the FL/FL^-1 layers.
Abstract:
A block cipher mode of operation is a cryptographic scheme that applies a block cipher to solve practical problems. A good mode of operation can compensate for certain shortcomings of a block cipher; conversely, a bad mode may introduce security risks. Research on modes of operation has accompanied the entire history of block cipher research, and the release of each new block cipher standard has been followed by research on corresponding modes. From ECB, CBC, CFB and OFB for DES, to CTR, CCM, CMAC, GCM and AESKW for AES, this paper takes the various mode standards as its main thread and surveys the design philosophy of block cipher modes of operation, their security models, more than twenty years of research results, and the current state of development.
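As a structural illustration of one of these modes, CTR turns a block cipher into a stream cipher by encrypting successive counter blocks and XORing the resulting keystream into the message; decryption is the identical operation, and the cipher's inverse is never needed. The sketch below uses a placeholder SHA-256-based "block cipher" as a stand-in assumption (in practice AES would sit here), and the 8-byte nonce/8-byte counter split is likewise only illustrative.

```python
import hashlib

def toy_E(key: bytes, block: bytes) -> bytes:
    # Stand-in for AES: any pseudorandom function of (key, block) works
    # to show CTR's structure, since CTR never inverts the cipher.
    return hashlib.sha256(key + block).digest()[:16]

def ctr(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # 8-byte nonce + 8-byte big-endian block counter = 16-byte input.
    out = bytearray()
    for i in range(0, len(data), 16):
        ks = toy_E(key, nonce + (i // 16).to_bytes(8, 'big'))
        out += bytes(a ^ b for a, b in zip(data[i:i + 16], ks))
    return bytes(out)

key, nonce = b'k' * 16, b'n' * 8
pt = b'attack at dawn, again and again'
ct = ctr(key, nonce, pt)           # encrypt
```

Because encryption and decryption coincide, applying `ctr` to the ciphertext with the same key and nonce recovers the plaintext; reusing a nonce under the same key, however, leaks the XOR of the two plaintexts.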
Abstract:
Testing and evaluation are important technical means of studying the security of cryptographic algorithms, and randomness properties are an important, practical part of such evaluation. Many different methods exist for testing the randomness of cryptographic algorithms, but there is still no complete, practical quantitative evaluation system or model for the voluminous results such tests produce. Taking block ciphers as an example, this paper studies the quantitative evaluation of the randomness of cryptographic algorithms. Based on block cipher design criteria, we propose an indicator system for evaluating the randomness of block ciphers and, building on fuzzy multi-criteria decision making, give a practical randomness evaluation model for block ciphers. The model uses membership functions from fuzzy mathematics to fuzzify the randomness test results, so it can reflect the continuous and gradual nature of randomness and effectively avoids the loss of evaluation information caused by pure threshold methods. Its advantage is that it achieves a quantitative evaluation of block cipher randomness and thus provides a basis for the comprehensive evaluation of cryptographic algorithms. We also give a generic evaluation procedure for individual indicators and attributes, so the model can, with minor modification and extension, be applied to the randomness evaluation of other types of cryptographic algorithms.
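The fuzzification step such a model relies on can be sketched with a simple ramp membership function: instead of a hard pass/fail threshold on a test's p-value, map it to a degree of membership in "random-looking" and aggregate the degrees with weights. The thresholds, weights and p-values below are invented for illustration, not the paper's actual indicator system.

```python
def membership(p, lo=0.01, hi=0.05):
    # Ramp membership function (illustrative thresholds): 0 below lo
    # (clearly non-random), 1 above hi (clearly passing), linear in
    # between, so a p-value just under a threshold is not discarded.
    if p <= lo:
        return 0.0
    if p >= hi:
        return 1.0
    return (p - lo) / (hi - lo)

def evaluate(pvalues, weights):
    # Weighted aggregate of per-test membership degrees in [0, 1].
    return sum(w * membership(p) for p, w in zip(pvalues, weights)) / sum(weights)

# Three invented test results: one clear pass, one borderline, one fail.
score = evaluate([0.40, 0.03, 0.004], [1.0, 1.0, 1.0])
```

A pure threshold scheme would score the borderline result as a flat fail; the membership function preserves its intermediate standing, which is the information-loss point made above.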
Abstract:
This paper presents AC (a cipher), a block cipher with a 128-bit block length and a 128-bit key length. Its overall structure is an SP-network, and encryption and decryption are similar. The design of AC combines the wide trail strategy with bit-slice techniques to ensure the algorithm's security against differential and linear cryptanalysis. The purpose of this paper is to invite public testing, analysis and evaluation of the AC block cipher.
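The SP-network shape, a round of key mixing, parallel S-box substitution and a bit permutation, can be sketched on a toy 16-bit state. The S-box and transposition permutation below come from Heys' well-known cryptanalysis tutorial cipher, not from AC itself, whose components are not specified in the abstract.

```python
# Heys' tutorial 4-bit S-box (a permutation of 0..15) and a bit
# transposition as the permutation layer -- illustrative components,
# NOT the actual AC design.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]
PERM = [(i % 4) * 4 + i // 4 for i in range(16)]   # transpose 4x4 bits

def sp_round(state: int, round_key: int) -> int:
    x = state ^ round_key                  # key mixing
    y = 0
    for n in range(4):                     # 4 parallel 4-bit S-boxes
        y |= SBOX[(x >> (4 * n)) & 0xF] << (4 * n)
    z = 0
    for b in range(16):                    # bit permutation (diffusion)
        z |= ((y >> b) & 1) << PERM[b]
    return z
```

Each step is a bijection, so the round is invertible, which is what lets decryption mirror encryption in an SP-network; the wide trail strategy then bounds how differential and linear trails can thread through the S-box and permutation layers.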