Supports multiple LLM providers (OpenAI, LocalAI, Ollama, Anthropic, Mistral, DeepSeek, Google Gemini, and more)
Offers scriptability via command-line arguments and terminal output
Allows users to define custom tools for LLM interaction
Includes built-in tools for system information, file manipulation, and command execution
Can run as an MCP server to provide tool access to other applications
5 GitHub stars
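Because the tool can run as an MCP server, another application could register it in its MCP client configuration. A minimal sketch of such a registration follows; the server name, command, and arguments here are placeholders, not the tool's actual CLI (which the list above does not specify):

```json
{
  "mcpServers": {
    "example-tool": {
      "command": "example-tool",
      "args": ["--mcp-server"]
    }
  }
}
```

Once registered this way, the client launches the command over stdio and gains access to the tool's built-in and user-defined tools through the MCP protocol.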