OpenLinks is a personal, free, open source, version-controlled static website generator for social links.
This project is developer-first, but that does not mean raw JSON should be your default CRUD surface. For most maintainers, the recommended path is:
- Use the repo's AI workflows/skills and checked-in automation docs for repo-native CRUD.
- Use the Studio webapp when the browser-based self-serve editor fits your workflow.
- Drop to direct JSON edits only when you need lower-level control or a manual fallback.
Referral links are now a supported maintainer surface. Prefer the repo-native AI CRUD docs first, and use `skills/referral-management/SKILL.md` when the work involves reusable referral families, offers, matcher/link shapes, a shared-vs-fork scope decision, or inbox/MCP batch import of referral links from email. The shared catalog in `data/policy/referral-catalog.json` is the higher-level authoring layer, the optional `data/policy/referral-catalog.local.json` file is the fork-owned overlay for local-only additions, and `links[].referral` remains the runtime/render contract with manual overrides. Studio is still a fallback for referral edits only when Advanced JSON is acceptable, and direct JSON edits stay the lower-level path. The canonical referral contract lives in `docs/data-model.md`, and the script-backed referral verification checklist lives in `docs/social-card-verification.md`.
Quick Links are now a shipped renderer behavior: when your top-level `data/links.json` contains eligible social/profile destinations, OpenLinks automatically derives an icon-first strip above the profile action bar. There is no separate Quick Links registry or authoring workflow yet; see `docs/data-model.md` for the canonical behavior contract and `docs/customization-catalog.md` for the current knob inventory.
OpenLinks is also an active upstream dependency for open-links-sites, a downstream control repo that builds many individual sites on top of this renderer and data contract. Maintainers should treat shared schema, policy, script, and build-output changes here as potentially downstream-visible work. See `docs/downstream-open-links-sites.md` for the current synopsis. The additive `links[].referral` contract and the shared `data/policy/referral-catalog.json` file are both downstream-visible surfaces; `data/policy/referral-catalog.local.json` is the fork-owned overlay side of that split and should stay out of upstream PRs.
- Governance doc: `docs/logo-governance.md`
- Global primary logo: `public/branding/openlinks-logo/openlinks-logo.svg`
- Explicit V2 alias: `public/branding/openlinks-logo/openlinks-logo-v2.svg`
- Archived non-winning V2 marks: `public/branding/openlinks-logo/v2/archive/`
- Runtime browser icon set (main app): `public/`
- Runtime browser icon set (Studio): `packages/studio-web/public/`
- Regenerate runtime brand assets: `bun run branding:assets`
OpenLinks generates one canonical site badge asset during dev and build at:
<deployed-origin>/badges/openlinks.svg
Canonical live example for this repo: https://openlinks.us/badges/openlinks.svg
Default behavior:
- label is fixed to `OpenLinks`
- message defaults to `profile.name`
- optional override lives at `site.sharing.badge.message`
- set `site.sharing.badge.enabled` to `false` to stop publishing the badge
Example override in `data/site.json`:

```json
{
  "sharing": {
    "badge": {
      "message": "My OpenLinks"
    }
  }
}
```

Recommended Markdown embed once your site is deployed:

```markdown
[![OpenLinks](https://<your-domain>/badges/openlinks.svg)](https://<your-domain>/)
```

GitHub Pages fork example:

```markdown
[![OpenLinks](https://<owner>.github.io/<repo>/badges/openlinks.svg)](https://<owner>.github.io/<repo>/)
```

- Static SolidJS site with minimal runtime complexity.
- Version-controlled content in `data/*.json`.
- Schema + policy validation with actionable remediation output.
- Rich and simple card support with build-time enrichment.
- Payments and tips links with multi-rail support, styled QR codes, and fullscreen scan mode.
- Build-time profile-avatar materialization with local fallback behavior.
- Build-time rich/SEO image materialization with local-only runtime behavior.
- Offline-friendly public app shell with cached same-origin assets and graceful analytics fallback after first online visit.
- GitHub Actions CI + config-driven production deploy pipeline already wired.
- Checked-in Render and Railway deployment targets for fork-first provider-native hosting.
- Theme and layout controls designed for forking and customization.
- Data-driven typography overrides via `data/site.json` (`ui.typography`).
- Developers who are comfortable editing JSON and markdown.
- Maintainers using AI agents to automate content updates.
- User auth/account system.
- CMS or WYSIWYG editor.
- Traffic analytics or pageview dashboards.
- Plugin marketplace.
For full walkthrough and troubleshooting, see Quickstart.
This repo is already configured for Cursor-managed remote workspaces via .cursor/environment.json.
Remote bootstrap behavior:
- uses a Node 20 base image, with Bun installed as the primary runtime in the managed workspace
- runs `bash .cursor/install.sh` during install to pin Bun from `package.json` and run `bun install`
- runs `bash .cursor/start.sh` when the remote shell starts
- persists `/workspace/.cache`, `/workspace/node_modules`, and Bun's install cache across sessions
Recommended first commands in a remote shell:
bun run validate:data
bun run typecheck
bun run dev

If you are working on Studio in a remote shell without Docker, skip `docker compose -f docker-compose.studio.yml up --build` and use the manual path from `docs/studio-self-serve.md`:
cp .env.studio.example .env.studio
bun run studio:db:migrate
bun run studio:api:dev
bun run studio:web:dev

- Preferred for repo-native maintenance: use the repo's AI workflows/skills through OpenClaw Update/CRUD Contract, OpenClaw Bootstrap Contract, AI-Guided Customization Wizard, Linktree Bootstrap Extractor, `skills/referral-management/SKILL.md` when referral catalog families/offers/matchers, fork-vs-upstream catalog scope, or inbox/MCP batch import need an interview-driven workflow, `skills/cache-rich-link-assets/SKILL.md` when rich-link image assets need to be committed, and `skills/openlinks-fork-identity-presence/SKILL.md` when you want other websites, repos, docs, or services to point back to your deployed OpenLinks fork.
- For referral links: prefer the repo-native AI CRUD docs and the referral-management skill first, including inbox/MCP batch import when the user wants referral codes mined from email; use Studio only when Advanced JSON fits the change, treat `data/policy/referral-catalog.local.json` as fork-owned overlay data, and use `docs/data-model.md` as the canonical runtime field reference.
- For branded payment/tip cards, treat card-shell icon wiring and QR badge wiring as separate checks: shared card chrome follows the known-site icon registry from `links[].icon` / `payment.rails[].icon`, while `badge.items.asset` only affects the QR center badge.
- Preferred for browser-based CRUD: use OpenLinks Studio when the self-serve onboarding/editor already covers your workflow. Referral editing there currently relies on Advanced JSON.
- Manual fallback: edit `data/*.json` directly only when you intentionally want the lower-level path or need to work outside the currently supported AI/Studio flows.
The repository currently ships these repo-local skill entrypoints under skills/:
- `skills/openlinks-fork-identity-presence/SKILL.md`: help other websites, apps, repos, docs, and service profiles point back to your deployed OpenLinks fork using its canonical URL and brand assets.
- `skills/referral-management/SKILL.md`: interview-driven referral catalog CRUD, inbox/MCP referral import planning, link-level `catalogRef` adoption, fork-local overlay decisions, and upstream-PR hygiene that keeps `data/policy/referral-catalog.local.json` out of shared diffs.
- `skills/cache-rich-link-assets/SKILL.md`: persist rich-card images and related metadata into the committed cache after link changes.
- `skills/create-new-rich-content-extractor/SKILL.md`: public-first workflow for deciding and implementing new rich metadata support when existing enrichment is insufficient, including when avatar-first social-profile support should rely on explicit `profileImage` capture versus the shared default `image -> profileImage` backfill; not for payment/tip-card logo or QR badge wiring.
Paste this one-liner into OpenClaw, Claude, or Codex (the prompts are mostly compatible with any coding agent):
Follow https://raw.githubusercontent.com/pRizz/open-links/main/docs/openclaw-bootstrap.md exactly for this repository. Execute Required Execution Policy, End-to-End OpenClaw Sequence, Automation and Identity Confirmation Rule, Social Discovery and Inference Contract, Deployment Verification Contract, Structured URL Reporting Schema, README Deploy URL Marker-Block Contract, and Final Output Contract exactly as written. If an existing setup is detected, ask the single route-confirmation and switch to https://raw.githubusercontent.com/pRizz/open-links/main/docs/openclaw-update-crud.md when selected.
Use this path when this is the first setup for a new fork or local clone.
Paste this one-liner into OpenClaw, Claude, or Codex:
Follow https://raw.githubusercontent.com/pRizz/open-links/main/docs/openclaw-update-crud.md exactly for this repository. Execute Required Startup Handshake (including conditional customization-audit selectors), Defaults, Customization Audit Path (Optional), Repository Resolution, Dirty Local Repository Handling, Interaction Modes, Identity and Discovery Policy, Update/CRUD Execution Sequence, Final Output Contract, and Required reason codes exactly as written. When customization_path=customization-audit, use https://raw.githubusercontent.com/pRizz/open-links/main/docs/customization-catalog.md as the checklist source.
Use this path for day-2 maintenance when the user likely already has a fork and/or local clone. If the user just says "help" or uses similarly vague maintenance wording in an existing repo, treat that as a default route into this Update/CRUD contract unless they are clearly asking for first-time bootstrap or runtime/code work.
Use this approach:
- Fork this repository.
git clone <your-repo-url>
cd open-links
bun install
bun run fork:reset

`bun run fork:reset` rewrites the repo to the minimal starter profile, clears inherited upstream identity/caches/badges/history, and empties the README deployment URL rows before you personalize anything.
If you want to inspect what would be cleared first, run:
bun run fork:reset --check

If you have a Linktree and want a bootstrap seed for profile/avatar/social/content links before editing `data/*.json`, run:
bun run bootstrap:linktree -- --url https://linktr.ee/<handle>

To sync new upstream code/docs/tooling into your fork while preserving fork-owned personalized files, run:
bun run sync:upstream

`bun run sync:upstream` is for forks and downstream repos only. `upstream` must point at a different repository than `origin`, typically pRizz/open-links for a fork. It fetches `upstream/main`, attempts a normal merge first, and only auto-resolves conflicts when every overlapping path is covered by `config/fork-owned-paths.json`. Shared-file conflicts still stop for manual resolution. The scheduled Upstream Sync GitHub workflow should not run in the canonical pRizz/open-links repo itself.
If your links use authenticated extractors (links[].enrichment.authenticatedExtractor), run guided cache setup before first dev/build:
bun run setup:rich-auth

If you use Medium, X, or Primal rich links and want the optional public audience metrics cached locally, run:
bun run public:rich:sync -- --only-link medium
bun run public:rich:sync -- --only-link x
bun run public:rich:sync -- --only-link primal

If you want to refresh the public follower-history artifacts locally before the nightly automation does it on `main`, run:
bun run followers:history:sync

Preferred path:
- use the repo AI workflows/skills and the docs above for routine CRUD
- use OpenLinks Studio when you want the browser-based self-serve path
Manual fallback:
- edit these files directly when you need the lower-level path:
  - `data/profile.json` - primary entity identity details for a person or organization.
  - `data/links.json` - simple/rich/payment links, groups, ordering.
  - `data/site.json` - theme, UI, quality, and deployment-related config.
Linktree-assisted bootstrap:
- use `bun run bootstrap:linktree -- --url https://linktr.ee/<handle>` to generate reviewable entity/link candidates before editing `data/profile.json` and `data/links.json`
Starter-state cleanup:
- on a new fork, run `bun run fork:reset` before editing `data/profile.json`, `data/links.json`, or `data/site.json`
- use `bun run fork:reset --check` to preview inherited files/artifacts that will be cleared
- if a stale fork is no longer obviously template-like, `bun run fork:reset` requires `--force`; pass it only when you intentionally want to wipe the current customized data
Starter presets:
- `data/examples/minimal/`
- `data/examples/grouped/`
- `data/examples/invalid/` (intentional failures for testing)
bun run validate:data
bun run dev
bun run build
bun run preview

OpenClaw should update only the rows between the exact marker lines below:
- rewrite only marker-bounded rows
- commit only if normalized URL/status values changed
- additional optional rows may include `render` and `railway` when a fork configures those targets
OPENCLAW_DEPLOY_URLS_START
| target | status | primary_url | additional_urls | evidence |
|---|---|---|---|---|
| aws | active | https://openlinks.us/ | none | Deploy Production -> Deploy AWS Canonical Site |
| github-pages | active | https://prizz.github.io/open-links/ | canonical=https://openlinks.us/ | Deploy Production -> Deploy GitHub Pages Mirror |
OPENCLAW_DEPLOY_URLS_END
If you want an AI agent workflow with explicit checkpoints and manual opt-outs, use the AI-Guided Customization Wizard. For automation-first execution paths, use OpenClaw Bootstrap Contract for first-time setup and OpenClaw Update/CRUD Contract for day-2 changes.
Recommended flow:
- Start with Quickstart.
- Prefer OpenClaw Update/CRUD Contract or AI-Guided Customization Wizard for routine repo-native CRUD.
- Use OpenLinks Studio when you want the browser-based self-serve path.
- Use Data Model and Customization Catalog as the contract/reference layer.
- Use Social Card Verification Guide after changing referral cards, profile-card metadata, follower history, analytics, or share behavior.
- Keep the shared upstream baseline in `config/deployment.defaults.json`, and use `config/deployment.json` only for fork-specific overrides.
- Run setup in apply mode so tracked site metadata and README deploy rows match the topology:
bun run deploy:plan
bun run deploy:setup -- --apply

- Push to `main`.
- Use the matching guide:
- Wait for GitHub CI to pass on `main`.
- Verify the live target:
  - GitHub Pages and AWS through workflow summaries
  - Render / Railway with `bun run deploy:verify:live -- --target=<target> --public-origin=<live-url>`
Local parity commands:
bun run ci:required
bun run ci:strict

Then use:
- Deployment Operations Guide for full troubleshooting and diagnostics flow.
- Render Deployment Guide for the provider-native Render path.
- Railway Deployment Guide for the provider-native Railway path.
- OpenClaw Bootstrap Contract for deployment URL reporting and README marker-block updates.
- OpenClaw Update/CRUD Contract for existing repo update sessions and interaction-mode behavior.
- Linktree Bootstrap Extractor for Linktree-first bootstrap of profile/link candidates.
- Adapter Contract Guide for the current deployment-adapter design and future host planning.
This repository now includes a multi-service self-serve control plane for non-technical onboarding and browser-based CRUD edits:
- `packages/studio-web` - marketing + onboarding + editor (Solid + Tailwind + shadcn-solid style components)
- `packages/studio-api` - GitHub auth/app workflows, fork provisioning, content validation/commit, deploy status, sync endpoints
- `packages/studio-worker` - scheduled sync trigger worker
- `packages/studio-shared` - shared contracts
See docs/studio-self-serve.md for local setup, Railway deployment, env variables, and GitHub App setup. Track implementation phases in docs/studio-phase-checklist.md.
Studio workspace tooling is Bun-first:
bun install
bun run studio:typecheck
bun run studio:lint
bun run studio:format
High-signal deployment checks:
- `required-checks` job in `.github/workflows/ci.yml` is green.
- `Deploy AWS Site` job in `.github/workflows/deploy-production.yml` is green when AWS is enabled in the effective deployment topology and opted in via GitHub settings.
- `Deploy GitHub Pages` job in `.github/workflows/deploy-production.yml` is green when GitHub Pages is enabled in the effective deployment topology, or intentionally skipped when that target is disabled.
- `Verify Production Deployment` is green for the currently enabled targets.
- If deploy fails, review workflow summaries and diagnostics artifacts.
Live build provenance surfaces:
- Footer row: `Built <UTC>` plus `Commit <shortSha>` when commit metadata is available.
- JSON endpoint: `/build-info.json` with `builtAtIso`, `commitSha`, `commitShortSha`, and `commitUrl`.
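As a sketch of the shape implied by the field list above, the `/build-info.json` payload might look like this (values are illustrative, not real build output):

```json
{
  "builtAtIso": "2024-01-01T12:34:56.000Z",
  "commitSha": "0123456789abcdef0123456789abcdef01234567",
  "commitShortSha": "0123456",
  "commitUrl": "https://github.com/pRizz/open-links/commit/0123456789abcdef0123456789abcdef01234567"
}
```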
- `bun run avatar:sync` - fetch profile avatar into `public/cache/profile-avatar/`, write the committed manifest `data/cache/profile-avatar.json`, and refresh the gitignored runtime overlay `data/cache/profile-avatar.runtime.json`.
- `bun run enrich:rich` - run non-strict rich metadata enrichment (diagnostic/manual mode) with known-blocker + authenticated-cache policy enforcement; routine runs leave `data/cache/rich-public-cache.json` unchanged and only update the local runtime overlay when needed.
- `bun run enrich:rich:write-cache` - run non-strict rich enrichment and explicitly persist refreshed public metadata into `data/cache/rich-public-cache.json`.
- `bun run enrich:rich:strict` - run policy-enforced rich metadata enrichment (blocking mode) with known-blocker + authenticated-cache policy enforcement; routine runs leave `data/cache/rich-public-cache.json` unchanged and only update the local runtime overlay when needed.
- `bun run enrich:rich:strict:write-cache` - run policy-enforced rich enrichment and explicitly persist refreshed public metadata into `data/cache/rich-public-cache.json`.
- `bun run public:rich:sync` - refresh public browser-derived Medium/X/Primal profile audience metrics into the committed stable cache at `data/cache/rich-public-cache.json` and the local runtime overlay at `data/cache/rich-public-cache.runtime.json` (non-auth, operator-invoked).
- `bun run followers:history:sync` - append the current follower/subscriber snapshots into the public CSV history files under `public/history/followers/` and refresh `public/history/followers/index.json`.
- `bun run setup:rich-auth` - first-run authenticated cache setup (captures only missing/invalid authenticated cache entries).
- `bun run auth:rich:sync` - guided authenticated rich-cache capture (updates `data/cache/rich-authenticated-cache.json` + `public/cache/rich-authenticated/*`).
- `bun run auth:rich:clear` - clear authenticated cache entries and unreferenced local assets (selector-driven; supports `--dry-run`).
- `bun run auth:extractor:new -- --id <id> --domains <csv> --summary "<summary>"` - scaffold a new authenticated extractor plugin + policy + registry wiring.
- `bun run linkedin:debug:bootstrap` - LinkedIn debug bootstrap (agent-browser checks + browser binary install check).
- `bun run linkedin:debug:login` - LinkedIn debug login watcher (autonomous auth-state polling; multi-factor authentication optional).
- `bun run linkedin:debug:validate` - LinkedIn authenticated metadata debug validator.
- `bun run linkedin:debug:validate:cookie-bridge` - LinkedIn debug validator with cookie-bridge HTTP diagnostic.
- `bun run images:sync` - fetch rich-card/SEO remote images into the committed cache at `public/cache/content-images/`, write the stable manifest `data/cache/content-images.json`, and refresh the gitignored runtime overlay `data/cache/content-images.runtime.json`.
- Cache-backed fetches are governed by the committed per-domain registry `data/policy/remote-cache-policy.json`. New remote hosts must be added there in the same change batch as link/extractor updates.
- `bun run dev` - start local dev server (predev runs strict enrichment in read-only public-cache mode and fails on blocking enrichment policy issues).
- `bun run validate:data` - schema + policy checks (standard mode).
- `bun run validate:data:strict` - fails on warnings and errors.
- `bun run validate:data:json` - machine-readable validation output.
- `bun run build` - avatar sync + strict enrichment + content-image sync + validation + production build. The strict enrichment pre-step updates only the local runtime overlay unless you intentionally ran a `*:write-cache` command beforehand; `images:sync` refreshes committed content-image cache artifacts when image bytes change.
- `bun run build:strict` - avatar sync + strict enrichment + content-image sync + strict validation + build. The strict enrichment pre-step updates only the local runtime overlay unless you intentionally ran a `*:write-cache` command beforehand; `images:sync` refreshes committed content-image cache artifacts when image bytes change.
- `bun run preview` - serve built output.
- `bun run typecheck` - TypeScript checks.
- Workflow: `.github/workflows/nightly-follower-history.yml`
- Public artifacts: `public/history/followers/*.csv`, `public/history/followers/index.json`
- Local parity: `bun run enrich:rich:strict:write-cache`, `bun run public:rich:sync`, `bun run followers:history:sync`, `bun run build`
- The workflow commits directly to `main` and deploys Pages in the same run. This avoids depending on downstream workflow fan-out from a bot-authored push.
Canonical paths:
- `data/cache/rich-authenticated-cache.json`
- `public/cache/rich-authenticated/`
- `output/playwright/auth-rich-sync/`
Setup/refresh flow:
- First-run idempotent setup (only missing/invalid cache entries): `bun run setup:rich-auth`
- Targeted refresh: `bun run auth:rich:sync -- --only-link <link-id>`
- Forced refresh (even when cache is already valid): `bun run auth:rich:sync -- --only-link <link-id> --force`
Clear flow:
- Dry run clear for one link: `bun run auth:rich:clear -- --only-link <link-id> --dry-run`
- Apply clear for one link: `bun run auth:rich:clear -- --only-link <link-id>`
- Apply clear for all authenticated cache entries: `bun run auth:rich:clear -- --all`
- Recapture after clear: `bun run setup:rich-auth` (or `bun run auth:rich:sync -- --only-link <link-id>`)
- `bun run quality:check` - standard quality gate.
- `bun run quality:strict` - strict quality gate.
- `bun run quality:strict:ci` - CI strict gate with advisory-only performance findings.
- `bun run quality:json` - standard quality JSON report.
- `bun run quality:strict:json` - strict quality JSON report.
- `bun run ci:required` - required CI checks.
- `bun run ci:strict` - strict CI signal checks with advisory-only performance findings.
Allowed URL schemes:
- `http`
- `https`
- `mailto`
- `tel`
Payment-enabled links and payment rails additionally support:
- `bitcoin`
- `lightning`
- `ethereum`
- `solana`
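As a hedged sketch only, link entries in `data/links.json` could use these schemes roughly like this. The `label` and `url` field names and all values here are illustrative assumptions; `docs/data-model.md` remains the canonical contract:

```json
[
  { "label": "Email me", "url": "mailto:hello@example.com" },
  { "label": "Call or text", "url": "tel:+15555550123" },
  { "label": "Tip in Bitcoin", "url": "bitcoin:bc1qexampleaddressonly" }
]
```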
Use `custom` for extension metadata:
- top-level `custom` in `profile`, `links`, and `site`
- per-link `custom` in each link object

Unknown top-level keys are allowed but warned. `custom` keys that collide with core keys fail validation.
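The collision rule above can be sketched as a small check. This is not the repo's actual validator, and the core-key set below is an assumption pieced together from field names mentioned in this README (`label` and `url` are illustrative):

```typescript
// Hypothetical sketch of the custom-key collision rule; not the repo's real
// validator. The core key names here are assumptions drawn from this README.
const CORE_LINK_KEYS = new Set([
  "label",
  "url",
  "icon",
  "payment",
  "metadata",
  "enrichment",
  "referral",
  "custom",
]);

// Returns one error message per custom key that shadows a core link key;
// an empty array means the link's custom block passes this check.
function checkCustomKeys(link: { custom?: Record<string, unknown> }): string[] {
  return Object.keys(link.custom ?? {})
    .filter((key) => CORE_LINK_KEYS.has(key))
    .map((key) => `custom key '${key}' collides with core link key`);
}
```

Under this sketch, a link with `"custom": { "url": "…" }` would fail validation, while `"custom": { "myTag": 1 }` would pass.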
For full data model details and examples, see Data Model.
- Re-run `bun run validate:data` and inspect path-specific remediation lines.
- Check URL schemes and required fields.
- Move extension fields into `custom` and avoid reserved-key collisions.
- Re-run with `bun run build` and inspect the first failing command's output.
- If strict mode fails, compare `bun run validate:data` vs `bun run validate:data:strict`.
- Re-run blocking enrichment diagnostics: `bun run enrich:rich:strict`.
- Check the canonical blocker registry: `data/policy/rich-enrichment-blockers.json`.
- Check the authenticated extractor policy: `data/policy/rich-authenticated-extractors.json`.
- Check the authenticated cache manifest: `data/cache/rich-authenticated-cache.json`.
- Review known blocked rich-metadata domains and timestamped attempt history: `docs/rich-metadata-fetch-blockers.md`.
- Review authenticated extractor architecture/workflow: `docs/authenticated-rich-extractors.md`.
- Check the `site.ui.richCards.enrichment` policy (`failureMode`, `failOn`, `allowManualMetadataFallback`) in `data/site.json`.
- If rich-card images look clipped, set `site.ui.richCards.imageFit=contain` (or override per link with `links[].metadata.imageFit`).
- If a blocked domain must be tested anyway, set an explicit override on that link: `links[].enrichment.allowKnownBlocker=true`.
- If `authenticated_cache_missing` is reported, run `bun run setup:rich-auth` (or `bun run auth:rich:sync -- --only-link <link-id>`) and commit the cache manifest/assets.
- To reset stale/bad authenticated cache data, clear entries first with `bun run auth:rich:clear -- --only-link <link-id>` (or `--all`), then recapture with `bun run setup:rich-auth`.
- If `metadata_missing` is blocking, add at least one manual field under `links[].metadata` (`title`, `description`, or `image`) or remediate the remote OG/Twitter metadata.
- If a manual or enriched rich-link image changed, run `bun run images:sync` and commit `data/cache/content-images.json` plus `public/cache/content-images/*` when they update.
- Temporary emergency bypass (local only): `OPENLINKS_RICH_ENRICHMENT_BYPASS=1 bun run build`.
- Force-refresh the avatar cache when needed: `bun run avatar:sync -- --force` (or set `OPENLINKS_AVATAR_FORCE=1`).
- Force-refresh the rich/SEO image cache when needed: `bun run images:sync -- --force` (or set `OPENLINKS_IMAGES_FORCE=1`).
- Confirm CI passed on `main`.
- On a fresh fork, open the Actions tab and click Enable workflows if GitHub says workflows are not being run on the fork. Then push again on `main`.
- Confirm the Pages source is GitHub Actions.
- Check deploy workflow summary for remediation notes.
- Verify base-path settings if publishing from a project page.
- Quickstart
- OpenClaw Bootstrap Contract
- OpenClaw Update/CRUD Contract
- Agent Triage Contract
- Customization Catalog
- Data Model
- Downstream: open-links-sites
- Payment Card Effect Samples
- Rich Metadata Fetch Blockers
- Rich Enrichment Blockers Registry
- Authenticated Rich Extractors
- Create New Rich Content Extractor
- LinkedIn Authenticated Metadata Debug Runbook
- Repo Skill: OpenLinks Fork Identity Presence
- Repo Skill: Cache Rich Link Assets
- Repo Skill: Create New Rich Content Extractor
- AI-Guided Customization Wizard
- Theming and Layout Extensibility
- Deployment Operations Guide
- Adapter Contract Guide
- `data/` - source content JSON and generated artifacts.
- `schema/` - JSON schemas.
- `scripts/` - validation, enrichment, and quality runners.
- `src/` - SolidJS app.
- `.github/workflows/` - CI and deploy automation.
- `.planning/` - project planning and phase artifacts.
If these extractor workflows helped you build new or improved extractors, please consider opening a pull request against https://github.com/pRizz/open-links so everyone can benefit. Feedback on the extractor workflows and docs is appreciated.
MIT (see LICENSE).
