A production-ready Retrieval-Augmented Generation (RAG) application built with Streamlit, LangChain, and ChromaDB, with Model Context Protocol (MCP) integration for tool calling and support for multiple LLM providers such as OpenAI and Ollama.
This RAG application lets users query their own documents conversationally: Streamlit provides the chat UI, LangChain handles LLM orchestration, and ChromaDB stores document embeddings for retrieval. It integrates the Model Context Protocol (MCP) to extend tool calling, supports a wide range of document formats, and works with multiple LLM providers, making it an extensible base for building chat assistants over private data.
Key Features
- ChromaDB vector database with Ollama embeddings for semantic search
- Support for multiple LLM providers including OpenAI and Ollama
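To illustrate the semantic-search step such a pipeline performs, here is a minimal, dependency-free sketch. Everything in it is a stand-in: `embed` is a toy bag-of-words embedder in place of Ollama embeddings, and the list of `(text, vector)` pairs stands in for a ChromaDB collection; the real application would use the actual embedding model and vector store.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: lowercase bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, store: list[tuple[str, Counter]], k: int = 2) -> list[str]:
    """Return the k stored documents most similar to the query."""
    qv = embed(query)
    ranked = sorted(store, key=lambda doc: cosine(qv, doc[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# Hypothetical document collection, embedded once at index time.
docs = [
    "ChromaDB stores document embeddings for semantic search",
    "Streamlit builds interactive web user interfaces in Python",
    "LangChain orchestrates calls to large language models",
]
store = [(d, embed(d)) for d in docs]

# Retrieval step: the top match becomes context for the LLM prompt.
context = retrieve("embeddings semantic search", store, k=1)
prompt = f"Answer using this context:\n{context[0]}\n\nQuestion: ..."
```

In the full application, the retrieved passages are injected into the prompt exactly as sketched in the last two lines, and the assembled prompt is sent to whichever LLM provider is configured.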