Hasty Briefs (beta)


Evolution: Training neural networks with genetic selection achieves 81% on MNIST

3 months ago
  • #evolutionary-learning
  • #neural-networks
  • #trust-based-selection
  • GENREG is an evolutionary learning system that optimizes neural networks through population-based selection without using gradients or backpropagation.
  • Networks accumulate 'trust' based on task performance, and high-trust genomes reproduce with mutations to create the next generation.
  • GENREG achieved 81.47% test accuracy on MNIST with 50,890 parameters and 100% accuracy on a letter recognition task.
  • Key differences from gradient-based training: no loss-function derivatives, no learning rate, and search over a population of networks rather than updates to a single model.
  • Evolutionary learning requires a stable fitness signal, which GENREG achieves by averaging performance over multiple samples.
  • Child mutation (perturbing high-trust parents) matters more than base mutation for exploring around strong genomes while maintaining population diversity.
  • GENREG demonstrates competitive performance with fewer parameters, suggesting neural networks are often overparameterized.
  • Training dynamics include rapid initial learning, steady mid-phase climb, and late-phase refinement.
  • Current limitations include slower speed compared to gradient descent and unclear scalability to high-resolution images.
  • Future work includes exploring convolutional architectures, multi-task learning, and theoretical analysis.
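The summary does not show GENREG's code, but the core loop it describes (trust accumulated as averaged fitness, selection of high-trust genomes, child mutation producing the next generation) can be sketched roughly. All names and the toy fitness task below are hypothetical illustrations, not GENREG's actual implementation:

```python
import random

# Hypothetical sketch of trust-based evolutionary selection on a toy task.
# A genome is a flat weight vector; no gradients or learning rate are used.
GENOME_SIZE = 16
POP_SIZE = 20
SAMPLES_PER_EVAL = 8      # averaging over samples stabilizes the trust signal
CHILD_MUTATION_STD = 0.1  # child mutation: small perturbation of a parent

def random_genome():
    return [random.gauss(0.0, 1.0) for _ in range(GENOME_SIZE)]

def evaluate(genome, sample):
    # Toy fitness: negative squared distance to a target vector.
    target = sample
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def trust(genome, samples):
    # Trust = fitness averaged over multiple samples (no backpropagation).
    return sum(evaluate(genome, s) for s in samples) / len(samples)

def mutate(genome, std=CHILD_MUTATION_STD):
    # Child mutation explores the neighborhood of a high-trust parent.
    return [g + random.gauss(0.0, std) for g in genome]

def generation(population, samples, elite_frac=0.25):
    # High-trust genomes survive and reproduce with mutations.
    ranked = sorted(population, key=lambda g: trust(g, samples), reverse=True)
    elites = ranked[: max(1, int(len(ranked) * elite_frac))]
    children = [mutate(random.choice(elites))
                for _ in range(len(population) - len(elites))]
    return elites + children

random.seed(0)
target = [0.5] * GENOME_SIZE
samples = [target] * SAMPLES_PER_EVAL
pop = [random_genome() for _ in range(POP_SIZE)]
for _ in range(50):
    pop = generation(pop, samples)
best = max(pop, key=lambda g: trust(g, samples))
print(round(trust(best, samples), 3))
```

On this toy task the averaged trust of the best genome climbs toward zero over generations, mirroring the rapid-then-steady training dynamics the summary describes; real tasks like MNIST would swap in classification accuracy as the fitness and a much larger genome.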