I spent 2 weeks playing god. My learnings from 597 genetic algorithm lineages
2 days ago
- #genetic-algorithms
- #evolution-simulation
- #machine-learning
- The author built an evolution simulator where creatures learn to walk toward food over millions of generations.
- Creatures are made of nodes (spheres) connected by muscles (springs), with physics constraints like friction and stiffness (see the body-plan sketch after this list).
- The fitness function was crucial and difficult to get right, with subtleties like progress banking, baseline position, and edge calculations (one reading is sketched below).
- Initial creatures were brainless oscillators; later versions added neural networks so behavior could respond to the environment (both controllers are sketched below).
- Various reproduction and mutation strategies were tested, including uniform crossover, interpolation, and single-point crossover (see the operator sketches below).
- NEAT (NeuroEvolution of Augmenting Topologies) was implemented to let network topologies evolve along with their weights, improving creature behaviors (its add-node mutation is sketched below).
- Neural Architecture Search (NAS) was used to optimize hyperparameters, with pure neural networks outperforming NEAT in some cases (a baseline search loop is sketched below).
- Genetic algorithms were chosen over gradient descent because the fitness function is non-differentiable and rollouts allow massive parallelism (see the evolution loop below).
- Key lessons include the importance of understanding theory before implementation and the value of constraints in shaping evolution.
- Future directions include energy systems, recurrent connections, HyperNEAT, and novelty search.
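
The sketches below are illustrative Python, not the author's code. First, the node-and-muscle body plan: a minimal data model, assuming 2D positions and per-node friction (`Node`, `Muscle`, and `random_creature` are hypothetical names).

```python
import random
from dataclasses import dataclass

@dataclass
class Node:
    # A sphere: 2D position plus how hard it grips the ground.
    x: float
    y: float
    friction: float

@dataclass
class Muscle:
    # A spring joining two nodes (by index in the creature's node list).
    a: int
    b: int
    rest_length: float
    stiffness: float  # spring constant; clamped ranges keep the physics stable

@dataclass
class Creature:
    nodes: list
    muscles: list

def random_creature(n_nodes=3):
    nodes = [Node(random.uniform(-1, 1), random.uniform(0, 1), random.random())
             for _ in range(n_nodes)]
    # Fully connect the nodes so the body holds together under simulation.
    muscles = [Muscle(i, j, random.uniform(0.5, 1.5), random.uniform(1.0, 10.0))
               for i in range(n_nodes) for j in range(i + 1, n_nodes)]
    return Creature(nodes, muscles)
```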
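
For the fitness bullet, one plausible reading of "progress banking" is that a creature keeps credit for its closest approach to the food even if it later stumbles backward, measured against its starting ("baseline") centroid. This is an assumption about the author's terms, not their actual scoring code:

```python
def fitness(frames, food_x):
    # frames: per-timestep lists of (x, y) node positions from one rollout.
    def centroid_x(nodes):
        return sum(x for x, _ in nodes) / len(nodes)

    baseline = centroid_x(frames[0])   # spawn position, the baseline
    start_gap = abs(food_x - baseline)
    banked = 0.0
    for frame in frames:
        progress = start_gap - abs(food_x - centroid_x(frame))
        banked = max(banked, progress)  # bank the best approach seen so far
    return banked
```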
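
The control bullet contrasts two controller generations: a brainless oscillator drives each muscle with a genome-encoded sine wave, while a neural controller maps sensor readings to muscle targets. Both are hypothetical stand-ins for the post's controllers:

```python
import math

def oscillator_target(t, gene):
    # Brainless control: each muscle's target length follows a sine wave
    # whose amplitude, frequency, and phase come straight from the genome.
    # No sensing, no feedback -- the earliest creatures walk on rhythm alone.
    amp, freq, phase = gene
    return 1.0 + amp * math.sin(2 * math.pi * freq * t + phase)

def nn_targets(sensors, weights, biases):
    # Later controllers: a single tanh layer mapping sensor readings
    # (e.g. heading to the food, node heights) to per-muscle targets,
    # letting behavior depend on the environment instead of the clock.
    return [math.tanh(sum(w * s for w, s in zip(row, sensors)) + b)
            for row, b in zip(weights, biases)]
```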
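
The three reproduction operators named in the crossover bullet have standard textbook forms, sketched here for real-valued genomes (the `mutate` helper is an added illustration, not from the post):

```python
import random

def uniform_crossover(a, b):
    # Each gene is taken from either parent with equal probability.
    return [x if random.random() < 0.5 else y for x, y in zip(a, b)]

def interpolation_crossover(a, b):
    # Child genes are a random blend of the parents' values;
    # only sensible for real-valued genes.
    t = random.random()
    return [t * x + (1 - t) * y for x, y in zip(a, b)]

def single_point_crossover(a, b):
    # Split both genomes at one point and splice the halves together.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genome, rate=0.05, scale=0.1):
    # Gaussian perturbation applied gene by gene.
    return [g + random.gauss(0, scale) if random.random() < rate else g
            for g in genome]
```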
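
For the NEAT bullet: the algorithm's signature structural mutation splits an existing connection with a new node, disabling the old connection and assigning innovation numbers so genomes stay alignable during crossover. A stripped-down sketch (real NEAT also tracks speciation, add-connection mutations, and more):

```python
import random
from dataclasses import dataclass

@dataclass
class Connection:
    src: int
    dst: int
    weight: float
    enabled: bool = True
    innovation: int = 0  # global historical marking; lets NEAT align genomes

def add_node_mutation(connections, next_node_id, next_innovation):
    # Split a random enabled connection by inserting a new node. The old
    # connection is disabled; the incoming half gets weight 1.0 and the
    # outgoing half inherits the old weight, so behavior is preserved at
    # the moment of mutation and complexity grows gradually.
    conn = random.choice([c for c in connections if c.enabled])
    conn.enabled = False
    connections.append(Connection(conn.src, next_node_id, 1.0,
                                  True, next_innovation))
    connections.append(Connection(next_node_id, conn.dst, conn.weight,
                                  True, next_innovation + 1))
    return next_node_id + 1, next_innovation + 2
```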
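
The post doesn't say how its NAS/hyperparameter search was run; plain random search over a small grid is a common baseline and is assumed here (the search-space values are invented for illustration):

```python
import random

SEARCH_SPACE = {  # invented values for illustration
    "hidden_layers": [1, 2, 3],
    "hidden_size": [4, 8, 16, 32],
    "mutation_rate": [0.01, 0.05, 0.1],
    "population_size": [100, 500, 1000],
}

def random_search(evaluate, trials=20):
    # evaluate(config) -> mean fitness after a fixed number of generations.
    best_config, best_score = None, float("-inf")
    for _ in range(trials):
        config = {k: random.choice(v) for k, v in SEARCH_SPACE.items()}
        score = evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score
```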
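
Finally, the gradient-descent bullet's reasoning in code form: the fitness rollout is a black box with no gradient, but every rollout is independent, so a GA can evaluate a whole population in parallel. A minimal generation loop, with a toy `rollout` standing in for the physics simulation:

```python
import random
from multiprocessing import Pool

def rollout(genome):
    # Placeholder for the physics simulation plus fitness scoring. The real
    # rollout is a black box: step the world, score the creature. Nothing
    # here is differentiable, so gradient descent has no gradient to follow.
    return -sum((g - 0.5) ** 2 for g in genome)

def perturb(genome, rate=0.1, scale=0.05):
    return [g + random.gauss(0, scale) if random.random() < rate else g
            for g in genome]

def evolve(pop_size=200, genome_len=16, generations=50, elite_frac=0.2):
    population = [[random.random() for _ in range(genome_len)]
                  for _ in range(pop_size)]
    with Pool() as pool:
        for _ in range(generations):
            # Every rollout is independent, so fitness evaluation scales
            # across all cores -- the "massive parallelism" a GA buys you.
            scores = pool.map(rollout, population)
            ranked = [g for _, g in sorted(zip(scores, population),
                                           key=lambda pair: pair[0],
                                           reverse=True)]
            elites = ranked[:max(1, int(elite_frac * pop_size))]
            population = elites + [perturb(random.choice(elites))
                                   for _ in range(pop_size - len(elites))]
    return population[0]

if __name__ == "__main__":  # guard needed for multiprocessing spawn mode
    print(evolve(generations=5))
```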