# ScrapeGraph MCP Server
## ⭐ Star us on GitHub
If this server is helpful, a star goes a long way. Thanks!
## Key Features
- 8 tools covering markdown conversion, AI extraction, web search, crawling, sitemap discovery, and agentic flows
- Remote HTTP MCP endpoint and local Python server support
- Works with Cursor, Claude Desktop, and any MCP‑compatible client
- Robust error handling, timeouts, and production‑tested reliability
## Get Your API Key
Create an account and copy your API key from the ScrapeGraph Dashboard.

## Recommended: Use the Remote MCP Server
Endpoint:

### Cursor (HTTP MCP)
Add this to your Cursor MCP settings (`~/.cursor/mcp.json`):
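A minimal sketch of the entry, with placeholders for the remote endpoint and your key (the server name `scrapegraph` is arbitrary; the `X-API-Key` header is the one this README uses for remote access):

```json
{
  "mcpServers": {
    "scrapegraph": {
      "url": "<REMOTE_MCP_ENDPOINT>",
      "headers": {
        "X-API-Key": "<YOUR_API_KEY>"
      }
    }
  }
}
```

Restart Cursor after saving so the new server is picked up.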
### Claude Desktop (via mcp-remote)
Claude Desktop connects to HTTP MCP via a lightweight proxy. Add the following to `~/Library/Application Support/Claude/claude_desktop_config.json` on macOS (adjust the path on Windows):
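A hedged sketch of the proxy entry, assuming `npx` is available; the endpoint and key are placeholders, and the exact `mcp-remote` argument shape should be checked against its own documentation:

```json
{
  "mcpServers": {
    "scrapegraph": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "<REMOTE_MCP_ENDPOINT>",
        "--header",
        "X-API-Key:<YOUR_API_KEY>"
      ]
    }
  }
}
```

Restart Claude Desktop after editing the config file.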
### Smithery (optional)
## Local Usage (Python)
Prefer running locally? Install and wire the server via stdio.

### Install
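For example, with pip (the package name below is an assumption; check PyPI for the published name):

```shell
# Package name assumed -- verify on PyPI before installing
pip install scrapegraph-mcp
```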
### Run the server
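Assuming the install step provides a console entry point (the command name here is an assumption), export your key via `SGAI_API_KEY` and start the stdio server:

```shell
# SGAI_API_KEY is the env var this README names for local use;
# the entry-point name is assumed
export SGAI_API_KEY="your-api-key"
scrapegraph-mcp
```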
## Available Tools
The server exposes 8 enterprise‑ready tools:

1. `markdownify`: Convert a webpage to clean markdown.
2. `smartscraper`: AI‑powered extraction with optional infinite scroll.
3. `searchscraper`: Search the web and extract structured results.
4. `scrape`: Fetch raw HTML from a URL.
5. `sitemap`: Discover a site’s URLs and structure.
6. `smartcrawler_initiate`: Start an async multi‑page crawl (AI or markdown mode).
7. `smartcrawler_fetch_results`: Poll results using the returned `request_id`.
8. `agentic_scrapper`: Agentic, multi‑step workflows with optional schema and session persistence.

## Troubleshooting
- Verify your key is present in your config (`X-API-Key` for remote, `SGAI_API_KEY` for local).
- Claude Desktop logs:
  - macOS: `~/Library/Logs/Claude/`
  - Windows: `%APPDATA%\Claude\Logs\`
- If a long crawl is “still running”, keep polling `smartcrawler_fetch_results`.