Make.com Integration
A complete guide to integrating HeadlessX with Make.com (formerly Integromat).
Automate web scraping with Make.com and HeadlessX: build powerful visual automation scenarios without writing code.
Quick Setup
Step 1: Add HTTP Module
- Create a new scenario
- Add HTTP → Make a request module
- Configure connection
Step 2: Configure Request
URL: http://your-headlessx-server:3001/api/website/html
Method: POST
Headers:
{
  "Content-Type": "application/json",
  "X-API-Key": "your-api-key-here"
}
Body (JSON):
{
  "url": "{{url}}",
  "stealth": true
}
Step 3: Parse Response
- Add JSON → Parse JSON module
- Map output from HTTP module
- Access data: {{data.html}}, {{data.title}}
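Outside Make.com, the same request and parsing step look like this in Python. The endpoint, headers, and body match the setup above; the response shape (`{"success": ..., "data": {...}}`) is an assumption based on the fields this guide maps ({{data.html}}, {{data.title}}).

```python
import json

# Endpoint from Step 2 (replace with your server).
API_URL = "http://your-headlessx-server:3001/api/website/html"

def build_request(url: str, api_key: str):
    """Return the headers and JSON body the Make.com HTTP module sends."""
    headers = {
        "Content-Type": "application/json",
        "X-API-Key": api_key,
    }
    body = json.dumps({"url": url, "stealth": True})
    return headers, body

def parse_response(raw: str):
    """Extract the fields the Parse JSON module exposes (assumed layout)."""
    data = json.loads(raw)["data"]
    return data["html"], data["title"]

headers, body = build_request("https://example.com", "your-api-key-here")
# Sample response for illustration only:
html, title = parse_response(
    '{"success": true, "data": {"html": "<h1>Hi</h1>", "title": "Hi"}}'
)
```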
Complete Scenario Examples
Example 1: RSS Feed → Scrape → Google Sheets
Modules:
1. RSS - Watch RSS feed items
   - Feed URL: https://example.com/feed
   - Limit: 10
2. HTTP - Make a request
   - URL: http://localhost:3001/api/website/content
   - Method: POST
   - Headers: X-API-Key: your-key
   - Body: { "url": "{{1.url}}", "stealth": true }
3. JSON - Parse JSON
   - JSON string: {{2.data}}
4. Google Sheets - Add a row
   - Spreadsheet: My Articles
   - Sheet: Sheet1
   - Values:
     - Title: {{1.title}}
     - Content: {{3.data.markdown}}
     - URL: {{1.url}}
     - Date: {{1.pubDate}}
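The mapping in this scenario can be sketched as a plain function: one RSS item plus the parsed scrape response becomes one sheet row. Field names mirror the mappings above ({{1.title}}, {{3.data.markdown}}, ...); the scrape-response layout is an assumption.

```python
def to_sheet_row(feed_item: dict, scrape: dict) -> list:
    """Column order: Title, Content, URL, Date — same as the Values mapping."""
    return [
        feed_item["title"],          # {{1.title}}
        scrape["data"]["markdown"],  # {{3.data.markdown}} (assumed shape)
        feed_item["url"],            # {{1.url}}
        feed_item["pubDate"],        # {{1.pubDate}}
    ]

row = to_sheet_row(
    {"title": "Post", "url": "https://example.com/a", "pubDate": "2024-01-01"},
    {"data": {"markdown": "# Post"}},
)
```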
Example 2: Scheduled Price Monitoring
Modules:
1. Tools - Set multiple variables
   - Variables: { "products": [ "https://store.com/product1", "https://store.com/product2" ] }
2. Flow Control - Iterator
   - Array: {{1.products}}
3. HTTP - Make a request
   - URL: http://localhost:3001/api/website/html
   - Body: { "url": "{{2.array}}", "stealth": true }
4. Text parser - Match pattern
   - Text: {{3.data.html}}
   - Pattern: <span class="price">\$(\d+\.\d+)</span>
5. Google Sheets - Update a row
   - Only if the price changed
6. Gmail - Send an email
   - Only if the price dropped
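The Text parser pattern in module 4, expressed as a Python regex. The sample HTML is illustrative; real product pages will vary.

```python
import re

# Same pattern as the Text parser module: capture the numeric price
# inside <span class="price">$...</span>.
PRICE_RE = re.compile(r'<span class="price">\$(\d+\.\d+)</span>')

def extract_price(html: str):
    """Return the price as a float, or None if no match."""
    m = PRICE_RE.search(html)
    return float(m.group(1)) if m else None

price = extract_price('<div><span class="price">$19.99</span></div>')
```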
Example 3: Content Republishing
Modules:
1. Webhooks - Custom webhook
   - Create webhook URL
2. HTTP - Scrape article
   - URL: http://localhost:3001/api/website/content
   - Body: {"url": "{{1.articleUrl}}"}
3. OpenAI - Create a completion
   - Prompt: Rewrite this article:\n{{2.data.markdown}}
4. HTTP - Upload images to S3
   - Extract images from HTML
5. WordPress - Create a post
   - Title: {{2.data.title}}
   - Content: {{3.choices[].message.content}}
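The "Extract images from HTML" step needs the image URLs out of the scraped article before anything can be uploaded. A minimal regex sketch; a real scenario would use a proper HTML parser.

```python
import re

# Pull the src attribute from <img> tags. Good enough for a sketch;
# not robust against unquoted or single-quoted attributes.
IMG_RE = re.compile(r'<img[^>]+src="([^"]+)"')

def extract_image_urls(html: str) -> list:
    return IMG_RE.findall(html)

urls = extract_image_urls(
    '<p><img src="https://example.com/a.png" alt=""><img src="/b.jpg"></p>'
)
```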
HeadlessX Custom App
Create a custom Make.com app for easier integration:
App Structure
Base URL: http://your-server:3001
Authentication: API Key
- Header: X-API-Key
- Value: the user's API key
Modules
1. Scrape HTML
- Action: Make an API call
- Endpoint: /api/website/html
- Method: POST
- Input:
- URL (required)
- Stealth (boolean)
- Proxy (text)
- Output:
- success (boolean)
- data.html (text)
- data.title (text)
- data.statusCode (number)
2. Scrape with JavaScript
- Endpoint: /api/website/html-js
- Input:
- URL (required)
- Wait For (select: load, networkidle, domcontentloaded)
- Timeout (number)
- Output: Same as Scrape HTML
3. Get Markdown Content
- Endpoint: /api/website/content
- Output:
- data.markdown (text)
- data.text (text)
- data.wordCount (number)
4. Take Screenshot
- Endpoint: /api/website/screenshot
- Input:
- Full Page (boolean)
- Format (select: png, jpeg)
- Quality (number 0-100)
- Output:
- data.screenshot (base64)
- data.width (number)
- data.height (number)
5. Google Search
- Endpoint: /api/google-serp/search
- Input:
- Query (required)
- Location (text)
- Num Results (number)
- Output:
- data.organicResults (collection)
- data.featuredSnippet (object)
Triggers
New Scrape Completed
- Type: Webhook
- Endpoint: /webhook/scrape-completed
- Output: Scrape results
Advanced Scenarios
Scenario 1: E-commerce Monitoring
Schedule (every 6 hours)
→ Iterator (product URLs)
→ HTTP (Scrape product page)
→ Text Parser (Extract price)
→ Router
→ Path A: Price dropped
→ Email notification
→ Update Google Sheets
→ Path B: Out of stock
→ Slack alert
→ Path C: No change
→ Log to Data Store
Scenario 2: Lead Generation Pipeline
Google Sheets (Watch new rows)
→ HTTP (Scrape company website)
→ Text Parser (Extract email/phone)
→ OpenAI (Enrich data)
→ HubSpot (Create contact)
→ Gmail (Send outreach)
Scenario 3: Content Aggregator
Multiple RSS Modules (different sources)
→ Aggregator
→ HTTP (Scrape full articles)
→ OpenAI (Summarize)
→ Airtable (Store)
→ Twitter (Share best content)
Error Handling
Error Handler Route
Add an error handler to the HTTP module:
- Right-click the HTTP module
- Add error handler
- Choose Break or Commit

Break: stop the scenario. Commit: continue with a fallback.
Retry Logic
Use Resuming error handler:
- Set max attempts: 3
- Set interval: 30 seconds
- Add delay between retries
Error Notification
HTTP Module
→ Error Handler
→ Gmail - Send error notification
→ Include error details
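The retry logic above (max 3 attempts with an interval between them) can be sketched as a plain function. `fetch` stands in for the HTTP module call; the interval is set to 0 in the example so it runs instantly.

```python
import time

def with_retries(fetch, max_attempts: int = 3, interval: float = 30.0):
    """Call fetch() up to max_attempts times, sleeping between attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except Exception:
            if attempt == max_attempts:
                raise              # Break: stop the scenario
            time.sleep(interval)   # delay between retries

calls = []
def flaky():
    """Fails twice, then succeeds — simulates a transient scrape error."""
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("timeout")
    return "ok"

result = with_retries(flaky, interval=0)
```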
Data Transformation
Extract Data with Tools
Text parser - Match pattern:
Pattern: <title>(.*?)</title>
Input: {{scrapeResult.html}}
Output: {{title}}
JSON - Parse JSON:
Input: {{httpResponse.body}}
Output: Structured data
Transform with Functions
Use Make's built-in functions in any mapped field (arguments are separated with semicolons):

{{substring(html; 0; 100)}}    // first 100 characters
{{replace(text; "\n"; " ")}}   // replace newlines with spaces
{{length(array)}}              // array length
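These Make functions behave like the following Python one-liners:

```python
html = "<html><head><title>Example</title></head></html>"

first_100 = html[0:100]                    # substring: first 100 characters
single_line = "a\nb".replace("\n", " ")    # replace: newlines -> spaces
count = len(["x", "y", "z"])               # length: number of array items
```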
Routers & Filters
Conditional Routing
Router after scrape:

Route 1: Success
- Filter: {{success}} = true
- Action: Process data

Route 2: CAPTCHA
- Filter: {{error.code}} = CAPTCHA_DETECTED
- Action: Alert admin

Route 3: Timeout
- Filter: {{error.code}} = TIMEOUT_ERROR
- Action: Retry with a longer timeout
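The same routing, as a plain function: pick a route from the scrape response. The error codes are the ones this guide uses; the response shape is an assumption.

```python
def route(response: dict) -> str:
    """Map a scrape response to a router path."""
    if response.get("success"):
        return "process"          # Route 1: Success
    code = response.get("error", {}).get("code")
    if code == "CAPTCHA_DETECTED":
        return "alert-admin"      # Route 2: CAPTCHA
    if code == "TIMEOUT_ERROR":
        return "retry-longer"     # Route 3: Timeout
    return "unhandled"
```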
Data Stores
Cache Scrape Results
Modules:
1. Data store - Search records
   - Key: {{1.url}}
   - If found → use cached data
2. Router
   - Route A: Cache hit → return cached
   - Route B: Cache miss → scrape fresh
3. HTTP - Scrape (Route B only)
4. Data store - Add record
   - Key: {{1.url}}
   - Value: {{3.data}}
   - TTL: 3600 (1 hour)
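The Data store pattern above, as an in-memory sketch: search by URL, return the cached value while it is fresher than the TTL, otherwise scrape and re-add. The clock is injected so the expiry is testable.

```python
import time

class ScrapeCache:
    """URL-keyed cache with a TTL, mirroring the Data store modules."""

    def __init__(self, ttl: float = 3600.0, clock=time.monotonic):
        self.ttl = ttl
        self.clock = clock
        self._store = {}

    def get(self, url: str):
        hit = self._store.get(url)
        if hit and self.clock() - hit[0] < self.ttl:
            return hit[1]   # cache hit -> return cached
        return None         # cache miss -> scrape fresh

    def put(self, url: str, data):
        self._store[url] = (self.clock(), data)

now = [0.0]
cache = ScrapeCache(ttl=3600, clock=lambda: now[0])
cache.put("https://example.com", {"html": "<h1>Hi</h1>"})
fresh = cache.get("https://example.com")
now[0] = 4000.0             # jump past the 1-hour TTL
expired = cache.get("https://example.com")
```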
Integrations with Popular Apps
Slack Integration
HTTP (Scrape)
→ Text parser (Extract key info)
→ Slack - Send message
→ Channel: #scraping-alerts
→ Message: "New data: {{info}}"
Google Sheets
Google Sheets (Watch new rows)
→ HTTP (Scrape each URL)
→ Google Sheets (Update row)
→ Status: "Scraped"
→ Data: {{result}}
Airtable
HTTP (Scrape)
→ Airtable - Create a record
→ Base: Website Data
→ Table: Scraped Pages
→ Fields:
- URL: {{url}}
- Title: {{title}}
- Content: {{markdown}}
Notion
HTTP (Scrape)
→ Notion - Create a page
→ Database: Knowledge Base
→ Properties:
- Title: {{title}}
- URL: {{url}}
- Content: {{markdown}}
Scheduling
Time-based Triggers
Schedule module options:
- Every X minutes: 15, 30, 60
- Every X hours: 1, 6, 12, 24
- Specific times: Daily at 9:00 AM
- Custom: Cron expression
Example Cron:
0 9 * * 1-5 # 9 AM, Monday-Friday
0 */6 * * * # Every 6 hours
Best Practices
1. Batch Processing
Don't scrape 1000 URLs at once:
- Use Iterator with batches of 5-10
- Add Sleep module between batches (2-5 seconds)
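Splitting a URL list the way the Iterator + Sleep combination does is just chunking; the batch size of 5 is the one suggested above.

```python
def batches(urls: list, size: int = 5) -> list:
    """Split urls into consecutive chunks of at most `size` items."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

chunks = batches([f"https://example.com/{i}" for i in range(12)], size=5)
```

Between each chunk, a Sleep module (2-5 seconds) keeps the scrape rate polite.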
2. Error Resilience
- Always add error handlers
- Set retry logic (max 3 attempts)
- Log errors to Data Store
3. Data Validation
Add a Filter after the scrape:
- Check success = true
- Validate that required fields exist
4. Resource Management
- Limit concurrent operations (3-5)
- Use Data Store for caching
- Clean up old data regularly
5. Monitoring
- Enable email notifications for errors
- Use Make's execution history
- Set up alerts for critical scenarios
Scenario Templates
Template 1: Daily News Digest
{
  "name": "Daily News Scraper",
  "modules": [
    { "type": "schedule", "schedule": "0 8 * * *" },
    { "type": "iterator", "array": ["url1", "url2", "url3"] },
    {
      "type": "http",
      "url": "http://localhost:3001/api/website/content",
      "body": { "url": "{{item}}" }
    },
    { "type": "aggregator", "combineArticles": true },
    {
      "type": "gmail",
      "subject": "Daily News Digest",
      "body": "{{articles}}"
    }
  ]
}
Template 2: Product Price Tracker
{
  "name": "Price Monitor",
  "modules": [
    { "type": "schedule", "interval": 360 },
    { "type": "googleSheets", "operation": "getRows", "sheet": "Products" },
    { "type": "http", "url": "http://localhost:3001/api/website/html" },
    { "type": "textParser", "pattern": "price" },
    {
      "type": "router",
      "routes": [
        { "filter": "priceDropped", "action": "sendAlert" }
      ]
    }
  ]
}
Troubleshooting
Scenario Not Running
- Check scenario is ON
- Verify trigger is configured
- Review execution history
HTTP Request Fails
- Test the URL manually: curl http://localhost:3001/health
- Verify the API key in headers
- Check request body format
Data Not Mapping
- Click Choose where to map
- Test previous module
- Verify field names match
Rate Limits
Make.com limits:
- Free: 1,000 operations/month
- Core: 10,000 operations/month
- Pro: 100,000+ operations/month
Pricing & Limits
| Plan | Operations/Month | Scenarios | Data Transfer |
|---|---|---|---|
| Free | 1,000 | 2 | 100 MB |
| Core | 10,000 | Unlimited | 1 GB |
| Pro | 100,000 | Unlimited | 10 GB |
| Teams | 1,000,000 | Unlimited | 100 GB |
1 Operation = 1 module execution