Exposes Model Context Protocol tools for interacting with local Ollama models.
This MCP server acts as a bridge, making your locally installed Ollama language models accessible through the Model Context Protocol (MCP). Other MCP-compatible applications, such as Claude Desktop or Cursor, can then interact directly with your local models for tasks like generating text, holding conversations with history, or managing your model library by pulling and deleting models.
Key Features
- List available Ollama models
- Engage in interactive chat with conversation history
- Generate responses from single prompts
- Download new models from the Ollama registry
- Remove models from the local installation
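The features above correspond to Ollama's standard REST endpoints (`/api/tags`, `/api/chat`, `/api/generate`, `/api/pull`, `/api/delete`), which a bridge like this ultimately calls on the local Ollama daemon. As a rough sketch, here is how the request for each operation might be assembled; the helper names are illustrative assumptions, not this server's actual tool names:

```python
# Sketch: requests a bridge like this might send to a local Ollama
# instance (Ollama's default base URL is http://localhost:11434).
# Helper names are illustrative; the endpoints are Ollama's public REST API.

OLLAMA_BASE = "http://localhost:11434"

def list_models_request():
    # GET /api/tags returns the locally installed models
    return ("GET", f"{OLLAMA_BASE}/api/tags", None)

def chat_request(model, messages):
    # POST /api/chat; `messages` carries the conversation history as
    # [{"role": "user" | "assistant" | "system", "content": ...}, ...]
    return ("POST", f"{OLLAMA_BASE}/api/chat",
            {"model": model, "messages": messages, "stream": False})

def generate_request(model, prompt):
    # POST /api/generate for a single-prompt completion (no history)
    return ("POST", f"{OLLAMA_BASE}/api/generate",
            {"model": model, "prompt": prompt, "stream": False})

def pull_request(name):
    # POST /api/pull downloads a model from the Ollama registry
    return ("POST", f"{OLLAMA_BASE}/api/pull",
            {"name": name, "stream": False})

def delete_request(name):
    # DELETE /api/delete removes a locally installed model
    return ("DELETE", f"{OLLAMA_BASE}/api/delete", {"name": name})
```

Setting `"stream": False` asks Ollama for a single JSON response instead of a stream of partial chunks, which keeps the MCP tool's return value simple.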
Use Cases
- Integrate local Ollama models into Claude Desktop
- Use local Ollama models within the Cursor editor for AI assistance
- Provide a standardized MCP interface through which applications can access local LLMs
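For the Claude Desktop use case, MCP servers are registered in the `mcpServers` section of `claude_desktop_config.json`. A minimal sketch follows; the `command`, arguments, and path are hypothetical placeholders, since this server's actual launch command is not specified here:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-mcp-server/build/index.js"]
    }
  }
}
```

After restarting Claude Desktop, the server's tools become available to the assistant alongside any other configured MCP servers.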