DOAJ Open Access 2025

Differential Cryptanalysis Based on Transformer Model and Attention Mechanism

XIAO Chaoen, LI Zifan, ZHANG Lei, WANG Jianxin, QIAN Siyuan

Abstract

In differential-cryptanalysis-based attacks, Bayesian optimization is typically used to verify whether partially decrypted data exhibit differential characteristics. Currently, the primary approach is to train a differential distinguisher using deep learning techniques. However, this method has a notable limitation: as the number of encryption rounds increases, the accuracy of the differential distinguisher decreases linearly. Therefore, a new differential characteristic discrimination method is proposed based on the attention mechanism and side-channel analysis. Using the difference relationship between multiple rounds of the ciphertext, a differential distinguisher for the SPECK32/64 algorithm is trained based on the Transformer model. In the key recovery attack, a novel scheme is designed, based on preprocessing of the ciphertext, to distinguish the most influential features of the ciphertext. In the key recovery attack on the SPECK32/64 algorithm, 2^6 chosen ciphertext pairs are used. Using the 20th-round ciphertext pairs, the 65 536 candidate keys of the 22nd round can be screened within 17 on average, completing the key recovery attack on the last two rounds. The experimental results show that this method achieves a success rate of 90%, effectively addressing the difficulty of recognizing ciphertext differential features as the number of encryption rounds increases.
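For context, SPECK32/64 is the smallest member of the SPECK family of ARX block ciphers: 16-bit words, rotation amounts 7 and 2, a 64-bit key, and 22 rounds. The attack described above consumes pairs of encryptions with a fixed input difference. The sketch below implements the cipher and pair generation from the public SPECK specification; the input difference (0x0040, 0x0000) is an assumption borrowed from prior neural-distinguisher work (Gohr, CRYPTO 2019), since the abstract does not state which difference the authors use.

```python
import random

MASK = 0xFFFF  # SPECK32/64 operates on 16-bit words

def ror(x, r):
    """Rotate a 16-bit word right by r bits."""
    return ((x >> r) | (x << (16 - r))) & MASK

def rol(x, r):
    """Rotate a 16-bit word left by r bits."""
    return ((x << r) | (x >> (16 - r))) & MASK

def expand_key(key, rounds=22):
    """SPECK key schedule; key = (l2, l1, l0, k0) as four 16-bit words."""
    l = [key[2], key[1], key[0]]
    k = [key[3]]
    for i in range(rounds - 1):
        # Round keys are derived by applying the round function to the l words
        l.append(((k[i] + ror(l[i], 7)) & MASK) ^ i)
        k.append(rol(k[i], 2) ^ l[i + 3])
    return k

def encrypt(pt, round_keys):
    """One SPECK32/64 encryption; pt is a pair of 16-bit words."""
    x, y = pt
    for rk in round_keys:
        x = ((ror(x, 7) + y) & MASK) ^ rk
        y = rol(y, 2) ^ x
    return x, y

def make_pairs(n, rounds, delta=(0x0040, 0x0000)):
    """Encrypt n chosen-plaintext pairs with fixed input difference delta
    under a random key, reduced to the given number of rounds.
    delta=(0x0040, 0x0000) is a hypothetical choice, not from the abstract."""
    key = tuple(random.randrange(1 << 16) for _ in range(4))
    ks = expand_key(key, rounds)
    pairs = []
    for _ in range(n):
        p0 = (random.randrange(1 << 16), random.randrange(1 << 16))
        p1 = (p0[0] ^ delta[0], p0[1] ^ delta[1])
        pairs.append((encrypt(p0, ks), encrypt(p1, ks)))
    return pairs
```

The full 22-round cipher reproduces the published SPECK32/64 test vector (key 1918 1110 0908 0100, plaintext 6574 694C, ciphertext A868 42F2); ciphertext pairs produced this way would form the input to a Transformer-based distinguisher like the one the abstract describes.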

Authors


XIAO Chaoen, LI Zifan, ZHANG Lei, WANG Jianxin, QIAN Siyuan

Citation Format

Xiao, C., Li, Z., Zhang, L., Wang, J., & Qian, S. (2025). Differential Cryptanalysis Based on Transformer Model and Attention Mechanism. https://doi.org/10.19678/j.issn.1000-3428.0068486

Quick Access

Journal Information
Publication Year
2025
Source Database
DOAJ
DOI
10.19678/j.issn.1000-3428.0068486
Access
Open Access ✓