# Content Monitoring & Change Detection

Track price changes, content updates, and new pages across any website with scheduled scraping and webhook alerts.
## The Problem

Businesses need to know when competitors change prices, when regulations update, or when news breaks. Manual checking is unreliable and doesn't scale:

- Competitor price changes that affect your market position
- Regulatory or policy page updates that require compliance action
- Content changes on partner or supplier sites
- New job listings, product launches, or press releases
## Solution Architecture

Combine scheduled scraping with diff comparison and webhook notifications:

1. **Schedule.** `POST /schedules` with a cron expression to run scrapes at your desired frequency: hourly, daily, or weekly.
2. **Compare.** Disable caching with `cache: false` so each scheduled run returns fresh data, then compare against your stored baseline.
3. **Alert.** Set `webhook_url` to receive results as they complete; your webhook handler performs the comparison and triggers notifications.
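The comparison in step 3 can be sketched as a small pure function on the receiving side. This assumes the webhook delivers the scraped text and source URL; the payload field names (`url`, `text`) are illustrative, not the documented body, so adapt them to what your endpoint actually receives:

```python
import hashlib

def handle_webhook(payload: dict, baselines: dict) -> bool:
    """Return True (fire an alert) when a result differs from its baseline."""
    url = payload.get("url", "")
    text = payload.get("text", "")
    current_hash = hashlib.sha256(text.encode()).hexdigest()
    changed = baselines.get(url) != current_hash
    baselines[url] = current_hash  # store the new baseline for the next run
    return changed
```

Hashing the content keeps the baseline store small: you compare fixed-length digests instead of full page bodies.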
## Quick Example

Scrape a page with caching disabled and compare against a stored snapshot to detect changes:
```python
import requests
import hashlib

def check_for_changes(api_key, url, previous_hash):
    """Scrape a page and detect content changes."""
    resp = requests.post(
        "https://api.alterlab.io/api/v1/scrape",
        headers={"X-API-Key": api_key},
        json={
            "url": url,
            "cache": False,
            "formats": ["text"]
        }
    )
    text = resp.json().get("text", "")
    current_hash = hashlib.sha256(text.encode()).hexdigest()
    if current_hash != previous_hash:
        return {"changed": True, "hash": current_hash, "content": text}
    return {"changed": False, "hash": current_hash}

# First run: establish baseline
result = check_for_changes("YOUR_API_KEY", "https://example.com/pricing", "")
stored_hash = result["hash"]

# Subsequent runs: detect changes
result = check_for_changes("YOUR_API_KEY", "https://example.com/pricing", stored_hash)
if result["changed"]:
    print("Content has changed — sending alert")
```
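The quick example passes the previous hash in by hand; in practice the baseline has to survive between runs. A minimal sketch using a local JSON file (the filename is arbitrary; any durable store such as a database or key-value cache works the same way):

```python
import json
from pathlib import Path

# Simple local state file; swap in a database or key-value store in production.
STATE_FILE = Path("monitor_state.json")

def load_hash(url: str) -> str:
    """Return the stored baseline hash for a URL, or "" on the first run."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text()).get(url, "")
    return ""

def save_hash(url: str, content_hash: str) -> None:
    """Persist the latest hash so the next run compares against it."""
    state = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
    state[url] = content_hash
    STATE_FILE.write_text(json.dumps(state))
```

Keying the state by URL lets one state file serve every page you monitor.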
## Advanced Patterns

### Scheduled Monitoring

Use the scheduler endpoint to automate recurring checks. Combine with a webhook to process results as they arrive:
```python
import requests

# Create a schedule to monitor a page every hour
response = requests.post(
    "https://api.alterlab.io/api/v1/schedules",
    headers={"X-API-Key": "YOUR_API_KEY"},
    json={
        "name": "Competitor pricing check",
        "url": "https://competitor.com/pricing",
        "cron": "0 * * * *",  # Every hour
        "cache": False,
        "extraction_schema": {
            "type": "object",
            "properties": {
                "price": {"type": "number"},
                "plan_name": {"type": "string"}
            }
        },
        "webhook_url": "https://your-app.com/webhooks/monitor"
    }
)

schedule = response.json()
print(f"Schedule created: {schedule.get('id')}")
```
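The `cron` field above uses the standard five-field syntax (minute, hour, day of month, month, day of week). A few presets for common monitoring cadences, assuming the scheduler accepts standard cron features such as step values (`*/6`) and ranges (`1-5`):

```python
# Standard five-field cron format: minute hour day-of-month month day-of-week
CRON_PRESETS = {
    "every hour":           "0 * * * *",
    "every 6 hours":        "0 */6 * * *",
    "daily at 09:00":       "0 9 * * *",
    "weekdays at 09:00":    "0 9 * * 1-5",
    "weekly, Monday 09:00": "0 9 * * 1",
}
```

Match the cadence to how often the page actually changes: hourly for volatile pricing pages, daily or weekly for policy pages.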
### Diff Detection

For granular change detection, use extraction schemas to pull specific fields and compare structured data instead of raw text:
```python
import requests

def monitor_structured(api_key, url, schema, previous_data):
    """Monitor structured fields for changes."""
    resp = requests.post(
        "https://api.alterlab.io/api/v1/scrape",
        headers={"X-API-Key": api_key},
        json={
            "url": url,
            "cache": False,
            "extraction_schema": schema
        }
    )
    current = resp.json().get("filtered_content", {})
    changes = {}
    for key, value in current.items():
        if previous_data.get(key) != value:
            changes[key] = {
                "old": previous_data.get(key),
                "new": value
            }
    return {"data": current, "changes": changes}

# Monitor specific fields
schema = {
    "type": "object",
    "properties": {
        "price": {"type": "number"},
        "in_stock": {"type": "boolean"},
        "shipping_estimate": {"type": "string"}
    }
}

previous = {"price": 29.99, "in_stock": True, "shipping_estimate": "2-3 days"}
result = monitor_structured("YOUR_API_KEY", "https://store.com/product/1", schema, previous)

if result["changes"]:
    for field, change in result["changes"].items():
        print(f"{field}: {change['old']} -> {change['new']}")
```
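One caveat: `monitor_structured` only walks the fields present in the new result, so a field that disappears from the page entirely is never reported. A small variant that diffs over the union of keys catches added, changed, and removed fields alike:

```python
def diff_fields(previous: dict, current: dict) -> dict:
    """Symmetric field diff: report every field whose value changed."""
    changes = {}
    for key in set(previous) | set(current):
        old, new = previous.get(key), current.get(key)
        if old != new:
            changes[key] = {"old": old, "new": new}
    return changes
```

A removed field shows up with `"new": None`, which your alerting logic can treat as its own event type.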
### Cost Tip

Use `extraction_schema` to extract only the fields you care about. This is more efficient than comparing full page content and gives you structured diffs.
## Related Guides

- **Scheduler Guide**: automate recurring scrapes with cron expressions and balance-based limits.
- **Webhooks Guide**: get notified when async scrapes complete with webhook callbacks.
- **News Monitoring Tutorial**: build news aggregation and brand monitoring systems.
- **Caching Guide**: learn when to cache and when to disable caching for fresh data.