Discover our curated collection of MCP servers for data science & ML. Browse 6,536 servers and find the perfect MCPs for your needs.
Enables semantic search of Elasticsearch documents, specifically tailored for Search Labs blog posts.
Facilitates integration with large language models by providing a backend service with file access, database connectivity, API integration, and vector database access.
Enables interaction with Jupyter notebooks running in a local JupyterLab environment using the Model Context Protocol (MCP).
Exposes API Market's endpoints as Model Context Protocol (MCP) resources, allowing Large Language Models to discover and interact with over 200 APIs defined by OpenAPI specifications.
Facilitates extraction of meaningful information from research papers via the Unstructured API for fine-tuning language models and reducing literature review time.
Provides convenient access to the Finbud Data REST API from server-side TypeScript or JavaScript.
Performs geospatial calculations and coordinate system conversions.
Enables AI agents to perform web searches via the Bocha API using the Model Context Protocol.
Enables querying of the Consumer Financial Protection Bureau's (CFPB) Consumer Complaint Database API from within applications like Claude Desktop.
Provides sophisticated persistent memory capabilities so AI systems can retain context and state across sessions.
Enables FastAPI applications to expose their endpoints as an AI-friendly, introspectable server using the Model Context Protocol.
Fetches gene and protein metadata from the NCBI Entrez API, functioning as both a command-line tool and an MCP server.
Empowers AI coding assistants with semantic search across Apple's comprehensive developer documentation, WWDC transcripts, and code examples.
Deploys an agentic AI architecture on AWS Fargate for Amazon ECS, enabling an AI service to interact with multiple Model Context Protocol servers to perform various actions.
Ingest source code and documentation to build a local vector index, exposing semantic search and file access to LLMs via Model Context Protocol (MCP) tools.
Builds self-contained, searchable documentation servers with hybrid vector and keyword search capabilities.
Automatically generates and updates comprehensive documentation for your codebase using AI.
Aggregates and provides persistent, searchable memory across all your AI coding agent sessions in a single vector database.
Connects Large Language Models (LLMs) to the Binspire API, offering standardized tools and contextual data for developing autonomous, AI-driven waste management agents.
Enables AI models to access and query jOOQ documentation, including SQL examples and best practices.
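The servers listed above all speak the Model Context Protocol, whose messages are JSON-RPC 2.0. As a minimal sketch of what a client sends when invoking a server tool, the helper below builds a `tools/call` request; the tool name and arguments shown are hypothetical, not taken from any specific server in this list:

```python
import json


def make_tools_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 request string for MCP's tools/call method."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })


# Example: ask a (hypothetical) documentation server to search for "jOOQ"
request = make_tools_call(1, "search", {"query": "jOOQ"})
```

The server replies with a JSON-RPC response carrying the tool's result; transport (stdio, HTTP, etc.) varies by server.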