Hasty Briefs

Using information theory to solve Mastermind

  • #Game Strategy
  • #Mastermind
  • #Information Theory
  • Mastermind is a game where the Code Master selects a secret code of four pegs drawn from six colors, giving \( 6^4 = 1{,}296 \) possible combinations.
  • Information theory quantifies the information a guess provides as \( I := \log_2 \frac{n}{n'} \) bits, where \( n \) is the number of codes consistent with the evidence before the guess and \( n' \) the number remaining after.
  • The strategy is to make the guess that maximizes the expected information, known as entropy: \( H := \sum_i p_i \log_2 \frac{1}{p_i} \), where \( p_i \) is the probability of the \( i \)-th feedback pattern.
  • On average, this strategy solves Mastermind in 4.47 guesses, close to the theoretical minimum: identifying one of 1,296 codes requires \( \log_2 1296 \approx 10.3 \) bits, or about 4 guesses.
  • Donald Knuth's 1976 analysis and subsequent studies show similar performance, with entropy maximization being a brute-force but effective approach.
  • Determining which codes remain consistent after each guess is NP-complete in the general case, making the approach expensive for longer codes or more colors.
  • The use of entropy in Mastermind is inspired by 3Blue1Brown's video on Wordle, a game similar to Mastermind.
  • All code for this project is available on GitHub, with the article published on June 11, 2025.
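The formulas above can be sketched in a few lines of Python. This is a minimal illustration, not the article's own code: the `feedback`, `information_bits`, and `entropy` functions are assumed names, and the standard 6-color, 4-peg rules are assumed.

```python
import math
from collections import Counter
from itertools import product

COLORS = range(6)
ALL_CODES = list(product(COLORS, repeat=4))  # 6**4 = 1,296 codes

def feedback(guess, secret):
    """Return (black, white): exact matches and color-only matches."""
    black = sum(g == s for g, s in zip(guess, secret))
    # Shared color counts include exact matches; subtract them for whites.
    common = sum(min(guess.count(c), secret.count(c)) for c in COLORS)
    return black, common - black

def information_bits(n_before, n_after):
    """Bits gained when a guess narrows n consistent codes down to n'."""
    return math.log2(n_before / n_after)

def entropy(guess, candidates):
    """Expected information (bits) of a guess: H = sum p_i * log2(1/p_i),
    where p_i is the fraction of candidates producing the i-th feedback."""
    buckets = Counter(feedback(guess, code) for code in candidates)
    n = len(candidates)
    return sum((k / n) * math.log2(n / k) for k in buckets.values())

# Pinning down one of 1,296 codes needs log2(1296) ~ 10.3 bits in total.
print(f"total information needed: {math.log2(len(ALL_CODES)):.2f} bits")
print(f"entropy of guess (0,0,1,1): {entropy((0, 0, 1, 1), ALL_CODES):.3f} bits")
```

A full entropy-maximizing solver would evaluate `entropy` for every possible guess against the current candidate set, play the best one, then filter the candidates to those whose feedback matches what was observed.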