Expand web sitemap and robots coverage#623

Closed
noodleonthecape wants to merge 1 commit intomainfrom
seo/web-sitemap

Conversation


@noodleonthecape noodleonthecape commented Mar 22, 2026

## Summary
- expand web sitemap generation to enumerate docs, blog posts, and planned OpenClaw use-case URLs
- update robots.txt to explicitly allow crawling the new /openclaw/use-cases surface
- keep private invite routes blocked from indexing

## Testing
- npm run build --workspace web
  - Next.js production build succeeded; the final output also reported a pre-existing missing ESLint plugin from the repo root config

## Notes
- This pairs with the separate use-case landing pages PR so sitemap coverage lands cleanly with the new URLs.
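To make the robots.txt change concrete, here is a minimal sketch of what a `web/app/robots.ts` along these lines might look like. The exact paths, the invite route prefix, and the sitemap domain are assumptions for illustration, not the PR's actual code; Next.js would type the return value as `MetadataRoute.Robots`, but a local type stands in here so the sketch is self-contained.

```typescript
// Hypothetical sketch of web/app/robots.ts (paths and domain are assumed).
// A local type mirrors the shape Next.js expects from a robots() export.
type RobotsConfig = {
  rules: { userAgent: string; allow?: string[]; disallow?: string[] }[];
  sitemap?: string;
};

export default function robots(): RobotsConfig {
  return {
    rules: [
      {
        userAgent: '*',
        // Explicitly allow the new use-case surface described in the PR.
        allow: ['/openclaw/use-cases/'],
        // Keep private invite routes blocked from indexing (prefix assumed).
        disallow: ['/invite/'],
      },
    ],
    sitemap: 'https://example.com/sitemap.xml', // placeholder domain
  };
}
```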



@devin-ai-integration bot left a comment


Devin Review found 1 potential issue.

View 3 additional findings in Devin Review.


Comment on lines +7 to +13
```ts
const OPENCLAW_USE_CASE_SLUGS = [
  'multi-agent-workflows-claude-codex',
  'how-to-let-ai-agents-message-each-other',
  'agent-orchestration-for-coding-teams',
  'slack-style-messaging-for-ai-agents',
  'human-in-the-loop-agent-workflows',
] as const;
```

🔴 Sitemap lists 5 use-case URLs with no corresponding routes (404s)

The OPENCLAW_USE_CASE_SLUGS array generates sitemap entries for /openclaw/use-cases/<slug>, but no pages or routes exist for these paths. There is no /openclaw/use-cases/ directory under web/app/, no dynamic [slug] route, no catch-all route, and no rewrites/redirects in next.config.mjs. All 5 URLs will return 404 to search engine crawlers, which Google Search Console reports as sitemap errors and can negatively impact crawl budget and SEO standing.

The same concern applies to web/app/robots.ts:8, which allows crawling of /openclaw/use-cases/. That is less impactful, though, since it merely permits crawling of a non-existent path.
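One way to address the finding, as a minimal sketch: derive sitemap entries only from slugs that are known to resolve to a page, rather than emitting every entry in a free-standing constant. The `ROUTABLE_SLUGS` registry and `useCaseSitemapEntries` helper below are hypothetical names for illustration; in a real fix the routable set would come from the filesystem or `generateStaticParams`.

```typescript
// Sketch only: guard sitemap generation against slugs with no backing route.
const OPENCLAW_USE_CASE_SLUGS = [
  'multi-agent-workflows-claude-codex',
  'how-to-let-ai-agents-message-each-other',
  'agent-orchestration-for-coding-teams',
  'slack-style-messaging-for-ai-agents',
  'human-in-the-loop-agent-workflows',
] as const;

// Hypothetical registry of slugs that actually have pages. In practice this
// would be derived from the app/ directory or the CMS, not hard-coded.
const ROUTABLE_SLUGS = new Set<string>([
  'multi-agent-workflows-claude-codex',
]);

function useCaseSitemapEntries(baseUrl: string): { url: string }[] {
  return OPENCLAW_USE_CASE_SLUGS
    // Drop any slug that would 404, so crawlers never see a dead sitemap URL.
    .filter((slug) => ROUTABLE_SLUGS.has(slug))
    .map((slug) => ({ url: `${baseUrl}/openclaw/use-cases/${slug}` }));
}
```

With this guard, shipping the sitemap change ahead of the landing-pages PR would simply produce fewer entries instead of 404s, and the two PRs could land in either order.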

