Provides a comprehensive server for the Model Context Protocol, integrating self-hosted LLMs via Ollama with a Supabase database for robust data persistence and retrieval.
This tool serves as a Model Context Protocol (MCP) server, designed to bridge the gap between large language models (LLMs) and structured databases. It enables seamless interaction with self-hosted LLMs like Llama2 and CodeLlama through Ollama, while leveraging Supabase for all data operations including storage, retrieval, and querying. Developers can utilize its well-defined API and integrated tools to build powerful AI applications that require both dynamic text generation and reliable data management, all within a containerized and easily deployable environment.
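Since the server implements the Model Context Protocol, clients talk to it over JSON-RPC 2.0 and invoke its tools via the protocol's `tools/call` method. The sketch below shows how such a request could be constructed; the tool name `generate_text` and its arguments are assumptions for illustration, not the server's actual tool surface.

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 request asking an MCP server to invoke a tool."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical example: ask a "generate_text" tool (backed by an
# Ollama-hosted model) for a completion.
request = make_tool_call(1, "generate_text",
                         {"model": "llama2", "prompt": "Summarize MCP."})
```

The same envelope works for any tool the server exposes; only the `name` and `arguments` fields change per call.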
Key Features
- Comprehensive CRUD operations with the Supabase database
- Seamless integration with self-hosted Ollama-based LLM models
- Full implementation of the Model Context Protocol specification
- Containerized deployment using Docker and Docker Compose
- Extensive testing suite including unit, integration, and E2E tests
Use Cases
- Executing SQL queries on Supabase databases with contextual information from LLMs.
- Generating dynamic text responses and content using self-hosted large language models.
- Storing and retrieving structured application data within Supabase through an intelligent API.
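Running LLM-influenced SQL against a live database, as in the first use case, usually calls for a safety gate. The following sketch, under the assumption that only read-only queries should reach Supabase, shows one minimal guard; the keyword list and policy are illustrative, not the server's actual validation logic.

```python
import re

# Keywords that would modify data; an illustrative, not exhaustive, blocklist.
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|truncate)\b", re.IGNORECASE
)

def is_safe_select(sql: str) -> bool:
    """Accept only a single SELECT statement with no data-modifying keywords."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # reject multi-statement input
        return False
    if not stripped.lower().startswith("select"):
        return False
    return not FORBIDDEN.search(stripped)
```

A production server would more likely rely on database-level roles and row-level security in Supabase rather than string inspection alone, but a pre-check like this cheaply rejects obviously unsafe model output.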