Full reference for the SEOgent CLI (`npm install -g seogent`). The CLI is a lightweight wrapper around the SEOgent REST API; every command maps to an API endpoint. All output is structured JSON to stdout; human-readable messages go to stderr and can be silenced with `--quiet`.

Install globally with `npm install -g seogent` (requires Node.js 18+). Need an API token? See the Getting Started guide for installation, token creation, and a first-scan walkthrough. For direct HTTP API usage, see the API Reference.
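Because JSON goes to stdout and progress messages go to stderr, the two streams can be split in a pipeline. A minimal sketch of the pattern, using a hypothetical `fake_seogent` stand-in (not part of the CLI) to mimic the stream behavior:

```shell
#!/bin/sh
# fake_seogent is a hypothetical stand-in that mimics the CLI's stream
# behavior: structured JSON on stdout, human-readable progress on stderr.
fake_seogent() {
  echo 'Starting scan...' >&2          # progress message (stderr)
  echo '{"scan_id":"scan_abc123"}'     # structured result (stdout)
}

# Pipe stdout to a JSON consumer; redirect stderr away (like --quiet).
json=$(fake_seogent 2>/dev/null)
echo "$json"
# -> {"scan_id":"scan_abc123"}
```

The same redirection works with the real CLI, e.g. `seogent credits 2>/dev/null | jq .`.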
### `seogent auth <token>`

Save your API token to `~/.seogent/config.json`.

```shell
seogent auth sk_your_api_key_here
```

### `seogent auth --show`

Display the saved token (masked).

```shell
seogent auth --show
```

### `seogent auth --remove`

Remove the saved token.

```shell
seogent auth --remove
```
Alternatively, set the `SEOGENT_API_KEY` environment variable:

```shell
export SEOGENT_API_KEY=sk_your_api_key_here
```

Priority order: `--api-key` flag > `SEOGENT_API_KEY` env var > config file.
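The precedence can be sketched as a small resolver. `resolve_key` below is a hypothetical helper written for illustration, not part of the CLI; it takes the `--api-key` value (possibly empty) and the key read from the config file:

```shell
#!/bin/sh
# Sketch of the documented key precedence; resolve_key is a hypothetical
# helper, not part of the CLI. Arguments: the value passed via --api-key
# (may be empty) and the key read from ~/.seogent/config.json.
resolve_key() {
  flag_key="$1"
  config_key="$2"
  if [ -n "$flag_key" ]; then
    echo "$flag_key"               # 1. explicit --api-key flag wins
  elif [ -n "$SEOGENT_API_KEY" ]; then
    echo "$SEOGENT_API_KEY"        # 2. environment variable
  else
    echo "$config_key"             # 3. config file fallback
  fi
}

SEOGENT_API_KEY=sk_from_env
resolve_key "" sk_from_config            # -> sk_from_env
resolve_key sk_from_flag sk_from_config  # -> sk_from_flag
```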
### `seogent scan <url>`

Start an SEO scan. Returns a `scan_id` immediately.

```shell
seogent scan https://example.com
```
Flags:

| Flag | Type | Description |
|---|---|---|
| `--urls <url>` | string | Additional URLs to scan (repeatable) |
| `--mode <mode>` | string | Crawl mode: `discover` (default) follows sitemap and links |
| `--max-pages <n>` | number | Maximum pages to scan (up to 10,000) |
| `--performance` | boolean | Include Core Web Vitals metrics |
| `--link-check` | boolean | Check for dead links and broken images |
| `--a11y` | boolean | Run WCAG accessibility audit on representative pages |
| `--webhook <url>` | string | URL to POST results to when scan completes |
Examples:

```shell
# Scan a domain (discovers pages automatically)
seogent scan https://example.com

# Scan specific URLs only
seogent scan https://example.com/page --urls https://example.com/other

# Limit to 50 pages
seogent scan https://example.com --max-pages 50

# Include Core Web Vitals
seogent scan https://example.com --performance

# Check for broken links
seogent scan https://example.com --link-check

# Run WCAG accessibility audit
seogent scan https://example.com --a11y

# Combine flags
seogent scan https://example.com --performance --link-check --a11y --max-pages 100
```
Response:

```json
{
  "scan_id": "scan_abc123def456",
  "status": "pending",
  "url": "https://example.com",
  "created_at": "2026-03-12T12:00:00Z"
}
```
### `seogent status <scan_id>`

Check the progress of a running scan.

```shell
seogent status scan_abc123def456
```
Response:

```json
{
  "scan_id": "scan_abc123def456",
  "status": "crawling",
  "progress": 45,
  "url": "https://example.com",
  "created_at": "2026-03-12T12:00:00Z"
}
```
Status values: `pending` | `crawling` | `analyzing` | `completed` | `failed`
### `seogent results <scan_id>`

Retrieve the full results of a completed scan.

```shell
seogent results scan_abc123def456
```
Flags:

| Flag | Type | Description |
|---|---|---|
| `--issues-only` | boolean | Only return pages that have issues |
| `--min-severity <level>` | string | Filter by minimum severity: `critical`, `high`, `medium`, `low` |
| `--per-page <n>` | number | Results per page for pagination |
| `--cursor <cursor>` | string | Cursor for next page of results |
Examples:

```shell
# Full results
seogent results scan_abc123def456

# Only pages with issues
seogent results scan_abc123def456 --issues-only

# High severity and above
seogent results scan_abc123def456 --min-severity high

# Paginate through large results
seogent results scan_abc123def456 --per-page 50 --cursor eyJwYWdl...
```
Response structure:

```json
{
  "scan_id": "scan_abc123def456",
  "status": "completed",
  "url": "https://example.com",
  "pages_scanned": 32,
  "average_score": 74,
  "summary": {
    "excellent": 2,
    "good": 12,
    "needs_work": 15,
    "poor": 3
  },
  "site_checks": {
    "checks": [
      {
        "key": "robots_txt",
        "name": "robots.txt",
        "status": "passed",
        "message": "robots.txt is accessible and valid",
        "category": "crawlability"
      }
    ],
    "duplicate_titles": {
      "count": 2,
      "found": true,
      "duplicates": [
        {
          "title": "Example Site",
          "pages": [
            "https://example.com/about",
            "https://example.com/contact"
          ]
        }
      ]
    },
    "duplicate_descriptions": {
      "count": 0,
      "found": false,
      "duplicates": []
    }
  },
  "top_issues": [
    { "issue": "Missing canonical tag", "count": 28 },
    { "issue": "Missing Open Graph tags", "count": 30 }
  ],
  "results": {
    "data": [
      {
        "url": "https://example.com/",
        "score": 82,
        "grade": "B",
        "failed_checks": [
          "No structured data found",
          "Missing Open Graph tags"
        ],
        "warnings": [
          "Title could be longer (28 characters, recommend 30-60)"
        ],
        "all_checks": [
          {
            "name": "Canonical Tag",
            "key": "canonical",
            "status": "failed",
            "message": "No canonical tag found. Add a self-referencing canonical.",
            "category": "indexability",
            "weight": 8
          }
        ]
      }
    ],
    "next_cursor": "eyJwYWdlIjoyLCJsaW1pdCI6NTB9",
    "prev_cursor": null,
    "per_page": 50
  }
}
```
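A response of this shape lends itself to `jq` for ad-hoc extraction. A quick sketch over a trimmed sample (the sample values are illustrative, and `jq` must be installed):

```shell
#!/bin/sh
# Trimmed sample of a results payload (values are illustrative).
results='{
  "top_issues": [
    { "issue": "Missing canonical tag", "count": 28 },
    { "issue": "Missing Open Graph tags", "count": 30 }
  ],
  "results": { "data": [
    { "url": "https://example.com/", "score": 82 },
    { "url": "https://example.com/about", "score": 95 }
  ] }
}'

# Page URLs scoring below 90.
echo "$results" | jq -r '.results.data[] | select(.score < 90) | .url'
# -> https://example.com/

# Most common issue site-wide.
echo "$results" | jq -r '.top_issues | max_by(.count) | .issue'
# -> Missing Open Graph tags
```

In practice you would pipe `seogent results <scan_id> --quiet` directly into `jq` instead of a stored sample.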
### `seogent cancel <scan_id>`

Cancel a running scan.

```shell
seogent cancel scan_abc123def456
```
### `seogent scans`

List your scans (paginated).

```shell
seogent scans
seogent scans --page 2
```
### `seogent domains`

List your scanned domains.

```shell
seogent domains
```
### `seogent credits`

Check your remaining credit balance.

```shell
seogent credits
```
### Global flags

These flags work with any command.

| Flag | Short | Description |
|---|---|---|
| `--api-key <key>` | `-k` | API key (overrides env and config) |
| `--api-url <url>` | `-u` | API base URL (default: `https://seogent.ai`) |
| `--quiet` | `-q` | Suppress non-JSON stderr output |
| `--help` | `-h` | Show help for any command |
The CLI outputs the same JSON structures as the REST API. See the API Reference for complete response schemas, field descriptions, and error formats.
### Exit codes

| Code | Meaning |
|---|---|
| 0 | Success |
| 1 | General error |
| 2 | Authentication error (missing or invalid key) |
| 3 | Not found |
| 4 | Rate limited |
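Scripts can branch on these codes with a `case` statement. A sketch, where `run_cmd` is a stand-in that only simulates an exit status (swap in a real `seogent` invocation in practice):

```shell
#!/bin/sh
# run_cmd simulates a command exiting with a given code; it stands in
# for a real seogent invocation so the sketch is self-contained.
run_cmd() { return "$1"; }

run_cmd 2
case $? in
  0) echo "success" ;;
  2) echo "auth error: check your API key" ;;
  4) echo "rate limited: retry later" ;;
  *) echo "failed" ;;
esac
# -> auth error: check your API key
```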
```shell
# 1. Start the scan
seogent scan https://example.com --quiet

# 2. Poll until completed (repeat as needed)
seogent status scan_abc123 --quiet

# 3. Fetch results
seogent results scan_abc123 --quiet
```

```shell
# Extract just the top issues
seogent results scan_abc123 --quiet | jq '.top_issues'

# Save full results to a file
seogent results scan_abc123 --quiet > results.json

# Count critical issues
seogent results scan_abc123 --min-severity critical --quiet | jq '.results.data | length'
```
```shell
SCAN_ID=$(seogent scan https://example.com --quiet | jq -r '.scan_id')

while true; do
  sleep 15
  STATUS=$(seogent status "$SCAN_ID" --quiet | jq -r '.status')
  [ "$STATUS" = "completed" ] && break
  [ "$STATUS" = "failed" ] && echo "Scan failed" && exit 1
done

CRITICAL=$(seogent results "$SCAN_ID" --min-severity critical --quiet | jq '.top_issues | length')
[ "$CRITICAL" -gt 0 ] && echo "$CRITICAL critical issues found" && exit 1
```
```shell
seogent scan https://example.com --webhook https://your-server.com/seo-callback --quiet
```

The webhook receives a POST request containing the full scan results JSON when the scan completes.