Hasty Briefs (beta)


Why are neural networks and cryptographic ciphers so similar? (2025)

2 days ago
  • #Cryptography
  • #Algorithm Design
  • #Neural Networks
  • Neural networks and cryptographic ciphers are structurally similar because of fundamental algorithmic parallels, not because one field copied the other.
  • Both fields process sequences in similar ways: recurrent neural networks resemble the sponge construction in SHA-3 for sequential processing, while Transformers and fast Message Authentication Codes use parallel chunk processing with position encodings.
  • The core design alternates linear and nonlinear layers, repeated many times: stacked linear layers alone collapse into a single linear map, and nonlinearity alone mixes poorly, but together they compound mixing and complexity without degenerating.
  • Efficient mixing is achieved by organizing state as a grid and alternating between row and column operations, as seen in attention/feed-forward layers in neural nets and operations in AES or ChaCha20 ciphers.
  • The similarities stem from three shared properties: weak correctness requirements (a cipher need only be invertible, a network need only be differentiable), a focus on complexity and thorough mixing, and a strong emphasis on hardware performance, favoring parallelism and optimization.
  • These properties drive convergent evolution in algorithm design, producing shared structures like deep, parallel, repeated-layer mixers, and they open the door to cross-field idea exchange, such as RevNets importing the Feistel construction into neural network architectures.
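The "grid state with alternating row/column rounds, each combining linear and nonlinear steps" pattern described above can be sketched with a toy ChaCha20-style mixer. This is purely illustrative (not a real cipher, and not code from the article); the names `rotl`, `mix`, and `rounds` are hypothetical, and the add-rotate-xor round function is a simplified stand-in for real designs.

```python
# Toy illustration (NOT a real cipher): state is a 4x4 grid of
# 32-bit words, mixed by alternating column rounds and row rounds.
# Each round combines a linear step (modular addition) with a
# nonlinear-in-GF(2) step (XOR plus rotation), repeated many times,
# mirroring the alternating-layers pattern the summary describes
# for both ciphers and deep networks.

MASK = 0xFFFFFFFF  # keep every word within 32 bits


def rotl(x, n):
    """Rotate a 32-bit word left by n bits."""
    return ((x << n) | (x >> (32 - n))) & MASK


def mix(a, b, c, d):
    """Mix four words ARX-style: add, rotate, xor."""
    a = (a + b) & MASK
    d = rotl(d ^ a, 16)
    c = (c + d) & MASK
    b = rotl(b ^ c, 12)
    return a, b, c, d


def rounds(state, n_rounds=8):
    """Alternate column mixing and row mixing on a 4x4 grid."""
    s = [row[:] for row in state]
    for r in range(n_rounds):
        if r % 2 == 0:  # column round: mix each column of the grid
            for j in range(4):
                col = mix(s[0][j], s[1][j], s[2][j], s[3][j])
                for i in range(4):
                    s[i][j] = col[i]
        else:           # row round: mix each row of the grid
            for i in range(4):
                s[i] = list(mix(*s[i]))
    return s
```

Alternating the axis of mixing is what lets local four-word operations diffuse across the whole grid in a few rounds, the same reason attention and feed-forward layers alternate in a Transformer block.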
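The RevNet/Feistel connection mentioned above rests on one trick: a Feistel round is invertible even when its round function is not. A minimal sketch (the names `F`, `feistel_forward`, and `feistel_inverse` are hypothetical, and `F` is an arbitrary toy function, not one from any real cipher or from the article):

```python
# A minimal Feistel network: the round function F need not be
# invertible, yet the overall construction always is. RevNets
# reuse exactly this structure so activations can be recomputed
# during backpropagation instead of stored.

def F(x, key):
    """Toy round function; deliberately not invertible on its own."""
    return (x * 2654435761 + key) & 0xFFFFFFFF


def feistel_forward(left, right, keys):
    """Apply one Feistel round per key: (L, R) -> (R, L ^ F(R, k))."""
    for k in keys:
        left, right = right, left ^ F(right, k)
    return left, right


def feistel_inverse(left, right, keys):
    """Undo the rounds in reverse order, reusing the same F."""
    for k in reversed(keys):
        left, right = right ^ F(left, k), left
    return left, right
```

Because inversion only ever XORs out `F` again, `F` can be as complicated as desired, which is why the same construction accommodates both cipher round functions and neural-network residual branches.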