The ScrapeGraphAI app for Make.com lets you connect any automation scenario to ScrapeGraph's v2 API, no code required. Fetch pages, extract structured data with an AI prompt, run web searches, kick off multi-page crawls, and schedule monitors, all as native Make modules.
This scenario runs daily, extracts all products from an Amazon search page, and saves each one as a row in Google Sheets, with no code required.

Full scenario flow:
Step 1 – Schedule trigger: Set the scenario to run daily (or any interval).

Step 2 – Extract module: Configure with your target URL, an extraction prompt, and an output schema.
URL: The product listing page to extract from
Extraction Prompt: Extract all products on the page with their name, price, rating, and number of reviews
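For reference, the output schema for this extraction could be sketched as the JSON below. This is illustrative: the field names mirror the prompt above, but the exact schema format accepted by the Extract module may differ.

```json
{
  "products": [
    {
      "name": "string",
      "price": "string",
      "rating": "number",
      "reviews": "number"
    }
  ]
}
```

Defining `products` as an array is what makes the Iterator step below possible, since the Iterator needs an array to loop over.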
Step 3 – Iterator: Add a Flow Control → Iterator module and set the Array field to {{2.json.products}}. This loops through each product and passes it to the next module one at a time.
Step 4 – Google Sheets: Add a Row: Map each field from the Iterator output:
Name → {{value.name}}
Price → {{value.price}}
Rating → {{value.rating}}
Reviews → {{value.reviews}}
Result: Every product on the page is saved as a separate row.
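Concretely, if the Extract module returns a products array like the hypothetical sample below (values are made up for illustration), each Iterator cycle exposes one element as value, which is what the {{value.name}}-style mappings above refer to:

```json
{
  "products": [
    {
      "name": "Example Wireless Mouse",
      "price": "$24.99",
      "rating": 4.5,
      "reviews": 1289
    }
  ]
}
```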
Poll the status and results of a crawl job started by Start Crawl.
| Field | Description |
| --- | --- |
| Crawl Job ID | The id output from Start Crawl; map it with {{1.id}} |
Returns status (running / completed / failed) and a pages array when completed.
Add a Tools β Sleep module (60 seconds) between Start Crawl and Get Crawl Status to give the crawl time to finish before polling. For large crawls, use two separate scenarios with a Make Data Store to persist the job ID.
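Outside Make, the same Sleep-then-poll pattern can be sketched in plain Python. This is a generic sketch, not ScrapeGraph's SDK: get_status stands in for whatever call fetches the job (for example an HTTP GET against the v2 crawl-status endpoint, whose exact URL is not shown here), and the demo stub at the bottom is purely illustrative.

```python
import time

def poll_until_done(get_status, interval=60, max_attempts=30):
    """Poll a job until its status leaves 'running'.

    get_status: a callable returning a dict with a 'status' key
    ('running', 'completed', or 'failed'), matching the statuses
    the Get Crawl Status module reports.
    """
    for _ in range(max_attempts):
        job = get_status()
        if job["status"] in ("completed", "failed"):
            return job
        time.sleep(interval)  # same idea as the Tools > Sleep module
    raise TimeoutError("crawl did not finish within the polling window")

# Demo with a stub that reports 'completed' on the third poll.
_state = {"polls": 0}
def _fake_status():
    _state["polls"] += 1
    return {"status": "running" if _state["polls"] < 3 else "completed"}

print(poll_until_done(_fake_status, interval=0)["status"])  # prints "completed"
```

Capping max_attempts mirrors why large crawls are better split into two scenarios with a Data Store: a single linear scenario can only wait so long before it times out.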