n8n Node
Community node for workflow automation. Drag a node onto your canvas, connect your API key, and pipe live web data anywhere — no code required.
Overview
n8n-nodes-alterlab is a community node published on npm. Once installed, an AlterLab node appears in the n8n node picker. Drop it into any workflow to scrape a URL, extract structured data with a JSON schema or natural language prompt, and wire the result to Slack, Google Sheets, a database, or any other n8n-connected service.
No Code
Configure scrape targets, output formats, and extraction rules entirely through n8n's visual UI.
Full Anti-Bot
Every node execution runs through AlterLab's tier escalation — Cloudflare, CAPTCHA, and JS-heavy sites handled automatically.
Any Destination
Feed scraped data directly to Slack, Google Sheets, Postgres, Airtable, webhooks, or any of n8n's 400+ integrations.
Installation
There are three ways to add the node depending on how you run n8n.
Option 1 — n8n Cloud or Docker (recommended)
Use the built-in community node installer in the n8n UI:
- Open your n8n instance and go to Settings → Community Nodes.
- Click Install a community node.
- Enter n8n-nodes-alterlab and click Install.
- Reload the editor — the AlterLab node will appear in the palette.
n8n Cloud restriction: n8n Cloud only allows community nodes that have been verified by n8n. If the node is not available there yet, use a self-hosted instance instead.
Option 2 — Self-hosted CLI
Install directly into the n8n global package from the same Node.js environment where n8n runs, then restart n8n:
npm install n8n-nodes-alterlab
# Restart n8n after installation
n8n start
Option 3 — Build from source
For local development or to pin a specific commit:
git clone https://github.com/RapierCraft/n8n-nodes-alterlab.git
cd n8n-nodes-alterlab
npm install
npm run build
# Link into your local n8n install
npm link
cd /path/to/your/n8n
npm link n8n-nodes-alterlab
Authentication
The AlterLab node supports two credential types. Both are configured under Credentials → New Credential in n8n.
API Key
Best for personal workflows or single-user setups.
- Go to your AlterLab Dashboard → API Keys and copy your key.
- In n8n, create a new AlterLab API credential.
- Paste the key into the API Key field and save.
OAuth2
Best for shared n8n instances where multiple team members run workflows.
- Create a new AlterLab OAuth2 credential in n8n.
- Click Connect — you will be redirected to AlterLab to sign in.
- Approve access. n8n stores and refreshes the token automatically.
Operations
The node exposes two operations selectable from the Operation dropdown inside the node editor.
Scrape
Fetches a URL and returns its content in one or more output formats. Supports full anti-bot tier escalation, JS rendering, proxies, screenshot capture, and structured data extraction. This is the operation you will use for the vast majority of workflows.
Estimate Cost
Previews the credit cost for a scrape request without actually executing it. Useful for building cost-gate logic in workflows — for example, only scraping when the estimate falls within budget. Returns a breakdown of base cost plus any add-on charges (rendering, proxies, screenshots).
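As a sketch of that cost-gate pattern, the filtering step could run in an n8n Code node between the Estimate Cost and Scrape steps. The `estimate.totalCredits` field name below is an assumption for illustration; check the actual Estimate Cost output shape in your own workflow.

```javascript
// Hypothetical cost gate: keep only items whose estimated credit cost
// still fits the remaining budget, in arrival order.
function filterWithinBudget(items, budget) {
  const kept = [];
  let remaining = budget;
  for (const item of items) {
    const cost = item.json.estimate.totalCredits;
    if (cost <= remaining) {
      remaining -= cost;
      kept.push(item);
    }
  }
  return kept;
}

// Example: three estimated scrapes against a 5-credit budget.
// The second item (4 credits) no longer fits and is skipped.
const items = [
  { json: { url: 'https://a.example', estimate: { totalCredits: 2 } } },
  { json: { url: 'https://b.example', estimate: { totalCredits: 4 } } },
  { json: { url: 'https://c.example', estimate: { totalCredits: 3 } } },
];
const result = filterWithinBudget(items, 5);
```

Only the URLs that pass the gate are handed to the Scrape operation, so a runaway batch fails cheap instead of spending over budget.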
Output Options
Configure what the node returns under the Output section of the node editor.
| Parameter | Values | Description |
|---|---|---|
| formats | markdown, json, html, text | One or more output formats. Default: [markdown, json]. JSON includes structured metadata; markdown is best for LLM pipelines. |
| timeout | 1–300 seconds | Maximum time to wait for page load and rendering. Default: 90. Increase for slow or JS-heavy pages. |
| cache | enabled/disabled, TTL 60–86400s | Enable response caching to avoid re-scraping the same URL within a workflow run. Set the TTL in seconds (default: disabled). Cached hits use zero credits. |
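Put together, an output configuration for an LLM pipeline with an hour of caching might look like the following. This is a hypothetical parameter fragment for illustration; the field names mirror the table above, but verify the exact shape in the node editor.

```json
{
  "formats": ["markdown", "json"],
  "timeout": 120,
  "cache": { "enabled": true, "ttl": 3600 }
}
```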
Advanced Options
Expand the Advanced section in the node editor to access rendering, proxy, and interaction controls.
renderJs
Enable headless browser rendering for JavaScript-heavy pages (React, Vue, SPAs). Adds +$0.0006 per request. Required for sites where content loads after the initial HTML response.
screenshot
Capture a full-page PNG screenshot alongside the scrape result. The image is returned as a base64-encoded string in the node output, ready to attach to emails or upload to storage.
useProxy
Route the request through a residential proxy pool. Improves success rates on geo-restricted or bot-protected sites. Pair with proxyCountry to target a specific region.
proxyCountry
Two-letter country code for geo-targeted proxy exit nodes. Supported values include US, DE, GB, JP, FR, CA, AU, and 30+ more. Requires useProxy to be enabled.
waitCondition
Controls when the headless browser considers the page ready. Options: networkidle (no network activity for 500ms — most reliable), domcontentloaded (fast, before deferred scripts run), load (all resources including images). Only relevant when renderJs is enabled.
removeCookieBanners
Automatically detect and dismiss cookie consent overlays before capturing content. Enabled by default. Disable only if the site requires a specific consent state for content access.
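Combined, the advanced options above for a JS-heavy, geo-restricted page might look like this. The fragment is a hypothetical sketch using the option names above; check the node editor for the exact parameter shape.

```json
{
  "renderJs": true,
  "waitCondition": "networkidle",
  "useProxy": true,
  "proxyCountry": "DE",
  "screenshot": false,
  "removeCookieBanners": true
}
```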
Extraction & Structured Data
The node can extract structured data from scraped pages using three approaches, configured under the Extraction section.
Built-in profiles
Select a pre-built extraction profile to instantly get clean, structured output for common page types — no schema writing required.
auto, product, article, job_posting, faq, recipe, event
The auto profile detects the page type and applies the best-matching extractor. product returns name, price, currency, availability, images, and reviews. article returns title, author, publish date, and body text.
Custom JSON Schema
Define exactly what fields to extract by providing a JSON Schema. AlterLab maps page content to your schema and returns a validated JSON object. Ideal for repeatable pipelines where you own the output shape.
{
"type": "object",
"properties": {
"title": { "type": "string" },
"price": { "type": "number" },
"currency": { "type": "string" },
"in_stock": { "type": "boolean" },
"rating": { "type": "number" },
"review_count":{ "type": "integer" }
},
"required": ["title", "price"]
}
Natural language prompts
Describe what you want in plain English. The extraction engine interprets the prompt against the scraped content and returns the result as structured JSON. Good for one-off or exploratory workflows.
Extract the product name, current sale price, original price,
discount percentage, and whether the item ships for free.
Extraction credits
Cost Controls
Use these parameters to cap spend and tune the tier escalation behaviour for each node. Found under Cost Controls in the node editor.
| Parameter | Type | Description |
|---|---|---|
| maxCredits | number | Hard credit ceiling per request. If escalation would exceed this, the request fails fast rather than spending over budget. Useful for bulk workflows where a runaway cost is unacceptable. |
| forceTier | 1–4 | Skip escalation and run exactly this tier. Use when you know the target always needs a specific tier (e.g., forceTier: 3 for a known Cloudflare-protected site). |
| maxTier | 1–4 | Allow escalation but cap it at this tier. Balances success rate against cost — for example, maxTier: 2 prevents spending on premium anti-bot passes for low-priority sources. |
| preferCost | boolean | Hint to the escalation engine to bias toward lower-cost tiers when multiple options have similar predicted success rates. Mutually exclusive with preferSpeed. |
| preferSpeed | boolean | Hint to bias toward faster tiers even if they cost slightly more. Use for time-sensitive monitoring workflows where latency matters more than per-request cost. |
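A typical cost-conscious configuration for a low-priority bulk source might combine these parameters as follows. This is an illustrative fragment; verify the exact shape in the node editor.

```json
{
  "maxCredits": 10,
  "maxTier": 2,
  "preferCost": true
}
```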
forceTier overrides intelligence
forceTier bypasses AlterLab's learned domain strategies. Only use it when you have confirmed the tier requirement from prior runs or from Cortex analytics.
Example Workflows
These are common patterns that n8n users build with the AlterLab node. Import any of them from the n8n workflow library by searching AlterLab.
Scrape product prices → Google Sheets
A Schedule Trigger fires daily. The AlterLab node scrapes a list of product URLs (supplied by a Code node or a Google Sheets lookup) using the product extraction profile. The resulting {name, price, in_stock} objects are written back to a Google Sheets row via the Google Sheets node. A final IF node sends a Slack alert if any price dropped more than 10%.
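The price-drop condition from that workflow can also live in a Code node instead of an IF node. A minimal sketch, assuming each item carries the freshly scraped price alongside the previous price read back from the sheet (the field names price and previous_price are illustrative):

```javascript
// Hypothetical price-drop check: true when the current price is more
// than thresholdPct percent below the previous price.
function priceDropped(current, previous, thresholdPct) {
  if (previous <= 0) return false; // no baseline to compare against
  const dropPct = ((previous - current) / previous) * 100;
  return dropPct > thresholdPct;
}

// Items whose drop exceeds 10% should trigger the Slack alert.
const items = [
  { json: { name: 'Widget', price: 18.99, previous_price: 19.99 } }, // ~5% drop
  { json: { name: 'Gadget', price: 44.0, previous_price: 50.0 } },   // 12% drop
];
const alerts = items.filter((i) =>
  priceDropped(i.json.price, i.json.previous_price, 10)
);
```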
Monitor news articles → Slack digest
An RSS Trigger (or Webhook) receives new article URLs. The AlterLab node scrapes each URL with the article profile, extracting title, author, publish date, and a clean markdown body. A Summarize node (or an OpenAI node) condenses the markdown to three bullet points. The result is posted to a Slack channel as a formatted digest card. Runs entirely without a paid news API subscription.
Extract job listings → database
A Schedule Trigger runs every 6 hours. A Code node builds a list of job board search URLs. The AlterLab node scrapes each with a custom JSON schema requesting {title, company, location, salary_range, apply_url} and renderJs: true for JS-rendered boards. A Merge node deduplicates on apply_url, and new records are inserted into Postgres via the Postgres node. A final Webhook node notifies a downstream service of the new batch.
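The deduplication step in that workflow could equally be a Code node instead of a Merge node. A sketch, assuming each scraped item exposes the apply_url field from the JSON schema above:

```javascript
// Keep only the first occurrence of each apply_url across the batch.
function dedupeByApplyUrl(items) {
  const seen = new Set();
  return items.filter((item) => {
    const key = item.json.apply_url;
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}

// Example batch with one reposted listing.
const items = [
  { json: { title: 'Data Engineer', apply_url: 'https://jobs.example/1' } },
  { json: { title: 'Data Engineer (repost)', apply_url: 'https://jobs.example/1' } },
  { json: { title: 'Backend Developer', apply_url: 'https://jobs.example/2' } },
];
const unique = dedupeByApplyUrl(items);
```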
Template library