Integration with Ollama for local LLM usage
Comprehensive tracing and monitoring with LangSmith
Iterative research process with multiple cycles
Supports Tavily and Perplexity APIs for web search
Research results stored as MCP resources for persistent access
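The iterative research process above can be sketched as a simple loop: an LLM refines the query each cycle, a search API returns results, and findings accumulate across cycles. The `search` and `refine_query` functions below are hypothetical stand-ins for the real Tavily/Perplexity and Ollama calls, so the sketch runs without network access; it illustrates the control flow only, not the project's actual API.

```python
# Minimal sketch of a multi-cycle research loop.
# search() and refine_query() are hypothetical stubs standing in for
# the real web-search and local-LLM calls described above.

def search(query: str) -> list[str]:
    # Stand-in for a Tavily/Perplexity web search request.
    return [f"result for '{query}'"]

def refine_query(topic: str, notes: list[str]) -> str:
    # Stand-in for an Ollama-hosted LLM generating the next query
    # from the topic and the findings gathered so far.
    return f"{topic} (cycle {len(notes) + 1})"

def research(topic: str, max_cycles: int = 3) -> list[str]:
    """Run several search/refine cycles and accumulate findings."""
    notes: list[str] = []
    for _ in range(max_cycles):
        query = refine_query(topic, notes)
        notes.extend(search(query))
    return notes

if __name__ == "__main__":
    print(research("local LLM tooling"))
```

In the real system, the accumulated `notes` would be what gets stored as MCP resources for later retrieval.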