About
This skill provides a standardized OpenAI-compatible interface for local LLMs running via Ollama. It lets developers use the official OpenAI Python library, LangChain, and LlamaIndex against local models without refactoring existing codebases. By supporting the key endpoints for chat completions, text generation, and embeddings, it enables a smooth transition from cloud-based API services to local hosting, so tools built for OpenAI work unchanged against your local model infrastructure.
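
The snippet below is a minimal sketch of what this looks like in practice with the official OpenAI Python library. It assumes the skill exposes its OpenAI-compatible endpoint at `http://localhost:11434/v1` (Ollama's default port and path); the model names `llama3` and `nomic-embed-text` are examples and should be replaced with models you have pulled locally.

```python
# Minimal sketch: point the official OpenAI client at the local endpoint.
# Base URL and model names are assumptions; adjust to your local setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed local OpenAI-compatible endpoint
    api_key="ollama",                      # placeholder; local servers typically ignore it
)

# Chat completion against a locally pulled model (example model name).
chat = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(chat.choices[0].message.content)

# Embeddings through the same interface (example embedding model).
emb = client.embeddings.create(model="nomic-embed-text", input="local inference")
print(len(emb.data[0].embedding))
```

Because only the `base_url` and `api_key` change, existing code written for the hosted OpenAI API should continue to work without modification.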