EM-LLM: Human-Inspired Episodic Memory for Infinite Context LLMs
- #ICLR-2025
- #LLM
- #episodic-memory
- EM-LLM integrates human episodic memory and event cognition into LLMs to handle infinite context lengths efficiently.
- The architecture organizes token sequences into episodic events using Bayesian surprise and graph-theoretic boundary refinement.
- Memory retrieval combines similarity-based and temporally contiguous access for human-like information recall.
- Experiments compare EM-LLM against other long-context methods on LongBench and an extended passkey-retrieval task.
- Installation requires the listed Python packages; models and experiments are configured via YAML files with detailed parameter settings.
- Key parameters include the chunk size, memory-block size limits, and retrieval settings, which should be tuned for best performance.
- Evaluation runs via provided scripts that accommodate different hardware setups and benchmarks.
- A citation is provided for the EM-LLM paper and its contributions to long-context handling and memory integration in LLMs.
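As a rough illustration of the YAML configuration the notes mention, a fragment might look like the following. Every key name and value here is hypothetical; consult the repository's actual config files for the real parameter names.

```yaml
# Hypothetical EM-LLM-style config fragment; key names are illustrative only.
model:
  path: mistralai/Mistral-7B-Instruct-v0.2  # example base model
memory:
  chunk_size: 512         # tokens processed per step
  min_block_size: 8       # smallest allowed episodic event (tokens)
  max_block_size: 128     # largest allowed episodic event (tokens)
  n_mem: 2048             # budget of retrieved memory tokens
  n_local: 4096           # local context kept verbatim
retrieval:
  repr_topk: 4            # representative tokens per block for scoring
  contiguity_buffer: 0.3  # fraction of budget reserved for neighbours
```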
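The surprise-driven segmentation mentioned above can be sketched in a few lines: flag an event boundary whenever a token's surprise (its negative log-probability) exceeds a rolling mean-plus-γ·σ threshold over recent tokens. The window size and γ here are illustrative assumptions, and the paper's graph-theoretic boundary-refinement step is omitted.

```python
import math

def surprise_boundaries(token_logprobs, window=32, gamma=1.0):
    """Minimal sketch of surprise-based event segmentation.

    Flags token index t as an event boundary when its surprise
    s_t = -log p(x_t | x_<t) exceeds mu + gamma * sigma computed
    over a trailing window. `window` and `gamma` are illustrative
    values, not the paper's settings.
    """
    boundaries = []
    for t, lp in enumerate(token_logprobs):
        s = -lp  # surprise of token t
        past = [-x for x in token_logprobs[max(0, t - window):t]]
        if len(past) < 2:
            continue  # not enough history to estimate a threshold
        mu = sum(past) / len(past)
        var = sum((p - mu) ** 2 for p in past) / len(past)
        if s > mu + gamma * math.sqrt(var):
            boundaries.append(t)
    return boundaries
```

A sequence of unsurprising tokens with one low-probability token would yield a single boundary at that token's position.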
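The two-stage recall described above (similarity-based plus temporally contiguous access) might look roughly like this NumPy sketch: a similarity buffer of the best-matching events, plus a contiguity buffer of each hit's temporal neighbours. Function name, scoring, and buffer sizes are assumptions for illustration, not the repository's actual API.

```python
import numpy as np

def retrieve_events(query, event_keys, k_sim=2, k_contig=1):
    """Sketch of similarity + contiguity retrieval over stored events.

    `event_keys` is a list of (tokens, dim) key arrays, one per event,
    in temporal order. Hypothetical interface for illustration only.
    """
    reps = np.stack([k.mean(axis=0) for k in event_keys])  # one summary vector per event
    scores = reps @ query                                  # dot-product similarity to the query
    sim_hits = [int(i) for i in np.argsort(scores)[::-1][:k_sim]]
    contig = set()
    for i in sim_hits:  # contiguity buffer: temporal neighbours of each hit
        for j in range(i - k_contig, i + k_contig + 1):
            if 0 <= j < len(event_keys):
                contig.add(j)
    # similarity hits first, then remaining neighbours in temporal order
    return sim_hits + sorted(contig - set(sim_hits))
```

Retrieving neighbours of each similarity hit mimics the human contiguity effect: recalling an event tends to cue events that occurred just before or after it.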