Show HN: Built a memory layer that stops AI agents from forgetting everything
5 days ago
- #AI coding assistants
- #pattern learning
- #persistent memory
- In Memoria solves AI coding tools' memory loss by providing persistent intelligence through the Model Context Protocol (MCP).
- Current AI tools re-analyze codebases every session, lack memory of architectural decisions, and can't learn from corrections.
- In Memoria runs as an MCP server exposing tools for codebase analysis and pattern learning, enhancing AI tools like Claude and Copilot (see the MCP tool sketch after this list).
- Core engines include an AST parser, a pattern learner, and a semantic engine, with storage backed by SQLite and SurrealDB.
- Features include learning coding patterns, naming conventions, and architectural decisions to provide context-aware suggestions.
- Supports both individual developers and teams, allowing learned intelligence to be shared so AI suggestions stay consistent across a team.
- Performance is optimized through incremental analysis and cross-platform Rust binaries, allowing large codebases to be handled efficiently (see the incremental-analysis sketch after this list).
- A comparison with GitHub Copilot, Cursor, and custom RAG pipelines highlights In Memoria's pattern learning and semantic analysis.
- Open-source with contributions welcome for language support, pattern learning improvements, and performance optimizations.
- No external data collection; all data stays local with minimal performance impact.
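For readers unfamiliar with how an MCP server exposes tools to clients like Claude, here is a minimal sketch using the official TypeScript MCP SDK. The tool name `analyze_codebase`, its input shape, and the stub result are illustrative assumptions, not In Memoria's actual API.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Minimal MCP server exposing one hypothetical tool.
const server = new McpServer({ name: "in-memoria-sketch", version: "0.1.0" });

// Hypothetical tool: an AI client calls this to request an analysis of a
// local codebase; a real server would delegate to AST/pattern engines
// instead of returning this stub text.
server.tool(
  "analyze_codebase",
  { path: z.string().describe("Absolute path to the project root") },
  async ({ path }) => ({
    content: [{ type: "text", text: `Analyzed codebase at ${path} (stub result)` }],
  })
);

// MCP clients typically spawn the server as a subprocess and talk over stdio.
const transport = new StdioServerTransport();
await server.connect(transport);
```

The AI tool discovers the registered tools over the protocol and decides when to call them, which is how the same server can back both Claude and Copilot-style clients.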
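The summary mentions incremental analysis without saying how it works. A common approach, sketched below as a generic illustration rather than the project's actual implementation, is to hash file contents and only re-parse files whose hashes changed since the previous run.

```typescript
import { createHash } from "node:crypto";
import { readFileSync, readdirSync, statSync } from "node:fs";
import { join } from "node:path";

// Map of file path -> content hash from the previous analysis run.
type HashIndex = Map<string, string>;

function sha256(buf: Buffer): string {
  return createHash("sha256").update(buf).digest("hex");
}

// Walk a directory tree and collect file paths (simplified: no ignore rules).
function listFiles(dir: string, out: string[] = []): string[] {
  for (const entry of readdirSync(dir)) {
    const full = join(dir, entry);
    if (statSync(full).isDirectory()) listFiles(full, out);
    else out.push(full);
  }
  return out;
}

// Return only the files whose contents changed since the previous index,
// plus the updated index to persist for the next run.
function changedFiles(root: string, previous: HashIndex): { changed: string[]; index: HashIndex } {
  const index: HashIndex = new Map();
  const changed: string[] = [];
  for (const file of listFiles(root)) {
    const hash = sha256(readFileSync(file));
    index.set(file, hash);
    if (previous.get(file) !== hash) changed.push(file); // new or modified file
  }
  return { changed, index };
}
```

Only the `changed` list would then be fed to the AST parser, so unchanged files are never re-analyzed between sessions.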