Significant context window savings for LLMs (90%+)
MCP server compatibility for seamless integration with AI tools
BM25 + vector hybrid search with Reciprocal Rank Fusion
Offline-first operation with local SQLite storage
Persistent project memory for AI assistants
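
The hybrid-search feature above combines two independent rankings. As a rough illustration (not this project's actual code), Reciprocal Rank Fusion merges a BM25 ranking and a vector-similarity ranking by scoring each document as the sum of 1/(k + rank) across the lists, with k commonly set to 60; the function and document IDs below are hypothetical:

```python
def rrf_fuse(rankings, k=60):
    """Fuse ranked lists of doc IDs via Reciprocal Rank Fusion.

    Each document scores sum(1 / (k + rank)) over the lists it
    appears in; documents ranked well by multiple lists rise.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical results from the two retrievers
bm25_hits = ["doc_a", "doc_b", "doc_c"]
vector_hits = ["doc_c", "doc_a", "doc_d"]
print(rrf_fuse([bm25_hits, vector_hits]))
# → ['doc_a', 'doc_c', 'doc_b', 'doc_d']
```

"doc_a" wins because it places highly in both lists, even though neither retriever ranked it first; that consensus effect is why RRF is a popular fusion choice for hybrid search.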