- Uses Ollama to run LLMs locally, ensuring data privacy.
- Employs the Model Context Protocol (MCP) for secure tool calling.
- Includes a Gradio-based chat interface for easy interaction.
- Utilizes a local SQLite database for data storage and management.
- Exposes database tools via FastMCP for LLM access.
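As a minimal sketch of the last two points, the snippet below defines the kind of SQLite-backed tool functions such a project might expose to the LLM. The table name (`notes`), tool names (`query_db`, `list_tables`), and schema here are illustrative assumptions, not the project's actual code; with the `mcp` package installed, these functions would be registered on a `FastMCP(...)` server (e.g. with `@mcp.tool()`) and served to the model.

```python
import sqlite3

# Illustrative local database; the real project manages its own SQLite file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO notes (body) VALUES ('hello'), ('world')")
conn.commit()

# With FastMCP available, each function below would be decorated with
# `@mcp.tool()` so the LLM can call it through the MCP protocol.
def list_tables() -> list[str]:
    """List user tables so the LLM can discover the schema."""
    rows = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'"
    ).fetchall()
    return [r[0] for r in rows]

def query_db(sql: str) -> list[tuple]:
    """Run a SQL query against the local SQLite database and return rows."""
    return conn.execute(sql).fetchall()
```

Keeping the tools as plain functions like this makes them easy to unit-test independently of the MCP transport.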