Integration with any OpenAI-compatible LLM service
Supports chat completion, model listing, and health checks
Full implementation of the Model Context Protocol
Configurable via environment variables
Provides both streaming and non-streaming responses
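As a minimal sketch of how such a service is typically driven, the snippet below builds an OpenAI-compatible `/chat/completions` request body, with configuration read from environment variables. The variable names (`LLM_BASE_URL`, `LLM_API_KEY`), the default model name, and the helper function are illustrative assumptions, not names taken from this project.

```python
import json
import os

# Assumed environment variable names; the project's actual names may differ.
BASE_URL = os.environ.get("LLM_BASE_URL", "http://localhost:8000/v1")
API_KEY = os.environ.get("LLM_API_KEY", "")

def build_chat_request(messages, model="gpt-4o-mini", stream=False):
    """Build an OpenAI-compatible /chat/completions request body.

    stream=True asks the server for server-sent-event chunks;
    stream=False returns one complete JSON response.
    """
    return {
        "model": model,
        "messages": messages,
        "stream": stream,
    }

# Example: a streaming request payload, serialized for an HTTP POST.
body = build_chat_request(
    [{"role": "user", "content": "Hello"}], stream=True
)
print(json.dumps(body))
```

The same payload shape works for both response modes; only the `stream` flag changes, which is why OpenAI-compatible services can expose both from one endpoint.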