Empower your AI agents with persistent, contextual memory using this Model Context Protocol (MCP) server. Designed for production environments, it enables AI agents to seamlessly store, retrieve, and manage knowledge across sessions. It features a robust stack including TypeScript, PostgreSQL with pgvector for efficient semantic search, and local embedding generation via Transformers.js to eliminate external API costs. With intelligent caching, multi-agent support, memory relationships, and async processing, it provides a performant and scalable solution for building smarter, more context-aware AI applications.
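To make the semantic-search idea concrete, here is a minimal, illustrative sketch (not the server's actual code) of ranking stored memories by cosine similarity — the same measure pgvector computes in-database with its `<=>` cosine-distance operator. The `Memory` type and tiny 3-dimensional embeddings are invented for the example; the real server generates full-size embeddings with Transformers.js and lets PostgreSQL do the ranking.

```typescript
// Hypothetical in-memory illustration of the similarity ranking that
// pgvector performs inside PostgreSQL.
type Memory = { id: string; text: string; embedding: number[] };

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the k memories most similar to the query embedding.
function topK(query: number[], memories: Memory[], k: number): Memory[] {
  return [...memories]
    .sort((m1, m2) =>
      cosineSimilarity(query, m2.embedding) -
      cosineSimilarity(query, m1.embedding))
    .slice(0, k);
}

const memories: Memory[] = [
  { id: "a", text: "user prefers dark mode", embedding: [1, 0, 0] },
  { id: "b", text: "project uses PostgreSQL", embedding: [0, 1, 0] },
  { id: "c", text: "user likes light themes", embedding: [0.9, 0.1, 0] },
];

console.log(topK([1, 0, 0], memories, 2).map(m => m.id)); // ["a", "c"]
```

In production the equivalent query would be a single `ORDER BY embedding <=> $1 LIMIT k` against the pgvector column, so the full embedding set never leaves the database.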
Key Features
- Advanced features like memory consolidation (clustering) and asynchronous job processing for background tasks
- Multi-agent support with user context isolation and memory relationships via a graph structure
- Local embedding generation using Transformers.js (no external API calls)
- Semantic memory management with PostgreSQL + pgvector for vector similarity search
- Intelligent two-tier caching with Redis and an in-memory fallback for fast lookups
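The two-tier caching pattern can be sketched as follows. This is a simplified illustration under assumed names (`TwoTierCache`, `Store`, `MapStore` are invented for the example), with a plain `Map` standing in for Redis so the snippet stays self-contained; the real server talks to an actual Redis instance and falls back to memory when Redis is unavailable.

```typescript
// Minimal interface an external store (e.g. Redis) would satisfy.
interface Store {
  get(key: string): Promise<string | undefined>;
  set(key: string, value: string): Promise<void>;
}

// Stand-in for Redis so the example runs without external services.
class MapStore implements Store {
  private data = new Map<string, string>();
  async get(key: string) { return this.data.get(key); }
  async set(key: string, value: string) { this.data.set(key, value); }
}

class TwoTierCache {
  private local = new Map<string, string>(); // tier 1: in-process memory
  constructor(private remote: Store) {}      // tier 2: shared store

  async get(key: string): Promise<string | undefined> {
    const hit = this.local.get(key);
    if (hit !== undefined) return hit;              // fast path, no I/O
    const value = await this.remote.get(key);
    if (value !== undefined) this.local.set(key, value); // warm tier 1
    return value;
  }

  async set(key: string, value: string): Promise<void> {
    this.local.set(key, value);
    await this.remote.set(key, value); // write-through to tier 2
  }
}

const cache = new TwoTierCache(new MapStore());
await cache.set("agent:42:note", "prefers concise answers");
console.log(await cache.get("agent:42:note")); // served from the local tier
```

Reads that hit the local tier skip network I/O entirely, which is where most of the latency win comes from; the write-through keeps the shared tier authoritative across processes.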