1 change: 1 addition & 0 deletions README.md
Original file line number Diff line number Diff line change
@@ -141,6 +141,7 @@ Install or disable them dynamically with the `/plugin` command — enabling you
- [documentation-generator](./plugins/documentation-generator)
- [generate-api-docs](./plugins/generate-api-docs)
- [openapi-expert](./plugins/openapi-expert)
- [swarmvault](./plugins/swarmvault)
- [update-claudemd](./plugins/update-claudemd)

### Git Workflow
23 changes: 23 additions & 0 deletions plugins/swarmvault/.claude-plugin/plugin.json
@@ -0,0 +1,23 @@
{
"name": "swarmvault",
"version": "0.7.30",
"description": "Local-first RAG knowledge vault. Compile raw sources into a durable markdown wiki with a knowledge graph and hybrid SQLite FTS plus embeddings. Ships an MCP server and slash commands for compile, query, and graph.",
"author": {
"name": "SwarmClaw AI",
"url": "https://swarmvault.ai"
},
"homepage": "https://swarmvault.ai",
"repository": "https://github.com/swarmclawai/swarmvault",
"license": "MIT",
"keywords": [
"rag",
"knowledge-base",
"wiki",
"graph",
"local-first",
"mcp",
"markdown",
"embeddings",
"fts"
]
}
8 changes: 8 additions & 0 deletions plugins/swarmvault/.mcp.json
@@ -0,0 +1,8 @@
{
"mcpServers": {
"swarmvault": {
"command": "npx",
"args": ["-y", "@swarmvaultai/cli", "mcp"]
}
}
}
21 changes: 21 additions & 0 deletions plugins/swarmvault/LICENSE
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 SwarmClaw AI

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
67 changes: 67 additions & 0 deletions plugins/swarmvault/README.md
@@ -0,0 +1,67 @@
# SwarmVault Plugin for Claude Code

[SwarmVault](https://swarmvault.ai) is a local-first RAG knowledge vault. It compiles raw sources (books, notes, transcripts, exports, datasets, slide decks, files, URLs, code) into a durable markdown wiki with a knowledge graph and a hybrid SQLite FTS plus embeddings index.

This plugin wires SwarmVault into Claude Code:

- Registers the SwarmVault MCP server so Claude can search pages, read the graph, and run query/ingest/compile/lint over the current vault.
- Ships the canonical SwarmVault skill so Claude knows when and how to use the vault.
- Adds three slash commands: `/swarmvault:compile`, `/swarmvault:query`, `/swarmvault:graph`.

## Requirements

- Node.js 18 or newer (the MCP server runs via `npx -y @swarmvaultai/cli`).
- A SwarmVault vault directory. If you don't have one yet, `cd` into an empty directory and run `npx -y @swarmvaultai/cli init`, or use the `/swarmvault:compile` command, which will offer to run `init` first.

## Install

```
/plugin install swarmvault
```

Or install directly from the repository while the marketplace listing is still pending:

```
/plugin marketplace add swarmclawai/swarmvault
/plugin install swarmvault@swarmvault
```

## Usage

Start Claude Code **from the vault root**:

```bash
cd ~/my-vault
claude
```

The plugin inherits that working directory, so the MCP server and slash commands operate against the vault without extra configuration.

### Slash commands

- `/swarmvault:compile` — rebuild the wiki, graph, and search index from `raw/`. Accepts the same flags as the CLI (`--approve`, `--max-tokens`, `--commit`).
- `/swarmvault:query <question>` — ask a question against the compiled vault. The answer is saved to `wiki/outputs/` by default.
- `/swarmvault:graph` — start the live graph viewer at a local URL.

### MCP tools

Once installed, the `swarmvault` MCP server is registered automatically. It exposes tools for page search, page reads, source listing, query, ingest, compile, and lint. See the SwarmVault docs at https://swarmvault.ai/docs for the full tool list.

### Skill

The bundled skill tells Claude when to reach for SwarmVault — typically when a project already contains `swarmvault.config.json` or `swarmvault.schema.md`, or when the user asks for durable notes, a knowledge base, or a graph over their sources. Claude will propose running `swarmvault init`, `ingest`, or `compile` when appropriate.

## Changing the vault directory

The MCP server runs in whatever directory Claude Code started in. To point at a different vault, restart Claude Code from that directory. If you need a fixed `cwd` (for example, when using Claude Code inside a monorepo whose root is not the vault), fork this plugin and add `"cwd": "/absolute/path/to/vault"` to the `swarmvault` entry in `.mcp.json`.
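For reference, a forked `.mcp.json` with a pinned vault directory would look roughly like this — the path below is a placeholder, not a real default, so substitute your own absolute vault path:

```json
{
  "mcpServers": {
    "swarmvault": {
      "command": "npx",
      "args": ["-y", "@swarmvaultai/cli", "mcp"],
      "cwd": "/absolute/path/to/vault"
    }
  }
}
```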

## License

MIT. See [LICENSE](./LICENSE).

## Links

- Website: https://swarmvault.ai
- Docs: https://swarmvault.ai/docs
- CLI on npm: https://www.npmjs.com/package/@swarmvaultai/cli
- Source: https://github.com/swarmclawai/swarmvault
25 changes: 25 additions & 0 deletions plugins/swarmvault/commands/compile.md
@@ -0,0 +1,25 @@
---
description: Compile the SwarmVault wiki from the current vault's raw sources.
argument-hint: "[--approve] [--max-tokens N]"
---

Run SwarmVault's compile pipeline over the current vault. This rebuilds the generated markdown wiki, knowledge graph, and search index from `raw/` using the rules in `swarmvault.schema.md`.

Before compiling:
1. Confirm the working directory is a SwarmVault vault (has `swarmvault.schema.md` or `swarmvault.config.json` at the root). If it doesn't, run `swarmvault init` first.
2. Read `swarmvault.schema.md` so you understand the vault's naming, categorization, grounding, and freshness rules. Update the schema before recompiling if organization or grounding looks wrong.

Then run:

```bash
swarmvault compile $ARGUMENTS
```

Useful flags:
- `--approve` — stage changes in `state/approvals/` for local review (`swarmvault review list|show|accept|reject`) instead of writing them live.
- `--max-tokens <n>` — cap the generated wiki at a bounded context budget.
- `--commit` — commit `wiki/` and `state/` changes when the vault lives in a git repo.

After compile:
- Summarize what changed in `wiki/` and `state/`.
- If `wiki/graph/report.md` exists, use it before broad repo search on follow-up questions.
23 changes: 23 additions & 0 deletions plugins/swarmvault/commands/graph.md
@@ -0,0 +1,23 @@
---
description: Open the SwarmVault graph viewer for the current vault.
---

Launch SwarmVault's live graph viewer so the user can explore the knowledge graph, bookmarklet-clip pages from the browser, and inspect community clusters.

Prerequisites:
1. Confirm the working directory is a SwarmVault vault.
2. If `state/graph.json` does not yet exist, run `swarmvault compile` first so there is a graph to view.

Run:

```bash
swarmvault graph serve
```

This starts a local HTTP server and prints the URL. When the user is done, they can stop it with Ctrl-C.

Related commands worth offering as follow-ups:
- `swarmvault graph export --html <output>` — export a shareable standalone HTML view.
- `swarmvault graph export --obsidian` — export an Obsidian-friendly view.
- `swarmvault graph blast <target>` — reverse-import impact analysis for a specific page.
- `swarmvault diff` — graph-level change summary against the last committed baseline.
24 changes: 24 additions & 0 deletions plugins/swarmvault/commands/query.md
@@ -0,0 +1,24 @@
---
description: Ask a question against the SwarmVault wiki and save the answer to disk.
argument-hint: "<question>"
---

Ask SwarmVault a question about the current vault. By default the answer is saved as a durable page in `wiki/outputs/` so the team can cite it later.

Prerequisites:
1. Confirm the working directory is a SwarmVault vault. If the vault has never been compiled, run `swarmvault compile` first (or the query will only see raw sources).
2. Skim `swarmvault.schema.md` so the question lands on the right grounding and naming conventions.

Run:

```bash
swarmvault query "$ARGUMENTS"
```

Useful flags:
- `--no-save` — skip writing to `wiki/outputs/` for an ephemeral check.
- `--commit` — commit the saved output immediately when the vault lives in a git repo.

After the run:
- Report where the answer was saved (a path under `wiki/outputs/`).
- If the question revealed gaps in the wiki, suggest a follow-up: add a new source, edit the schema, or rerun `swarmvault compile`.
81 changes: 81 additions & 0 deletions plugins/swarmvault/skills/swarmvault/SKILL.md
@@ -0,0 +1,81 @@
---
name: swarmvault
description: "Use SwarmVault when the user needs a local-first knowledge vault that writes durable markdown, graph, search, dashboard, review, and MCP artifacts to disk from books, notes, transcripts, exports, datasets, slide decks, files, URLs, code, and recurring source workflows."
---

# SwarmVault

Use this skill when the user wants a local-first knowledge vault built on the [LLM Wiki](https://gist.github.com/karpathy/442a6bf555914893e9891c11519de94f) pattern — three layers (raw sources, wiki, schema) where the LLM maintains a durable wiki between you and raw sources. Also use it when the project already contains `swarmvault.config.json` or `swarmvault.schema.md`.

For onboarding, examples, command references, or troubleshooting, read SwarmVault's bundled docs at https://swarmvault.ai/docs before improvising workflow advice.

## Quick checks

- Work from the vault root.
- If the vault does not exist yet, run `swarmvault init`.
- Use `swarmvault demo --no-serve` when the user wants the fastest zero-config walkthrough before pointing SwarmVault at their own sources.
- Use `swarmvault scan <directory> --no-serve` when the user wants the fastest scratch pass over a local repo or docs tree without manually stepping through init + ingest + compile first.
- Read `swarmvault.schema.md` before compile or query work. It is the vault's operating contract.
- If `wiki/graph/report.md` exists, use it before broad repo search.

## Core loop

1. Initialize a vault with `swarmvault init` when needed.
2. Update `swarmvault.schema.md` before a serious compile. Use it for naming rules, categories, grounding, freshness expectations, and exclusions.
3. Use `swarmvault source add <input>` when the input is a recurring local file, local directory, public GitHub repo root, or docs hub that should stay registered.
4. Ingest one-off inputs with `swarmvault ingest <path-or-url>`, or ingest a whole repo tree with `swarmvault ingest <directory>`. Audio files use `tasks.audioProvider` when configured, and supported YouTube URLs go through direct transcript capture instead of generic URL ingest.
5. Use `swarmvault ingest --guide`, `swarmvault source add --guide`, `swarmvault source reload --guide`, `swarmvault source guide <id>`, or `swarmvault source session <id>` when the human should integrate one source at a time before canonical pages change. Set `profile.guidedIngestDefault: true` in `swarmvault.config.json` to make guided mode the default; use `--no-guide` to override. Profiles using `guidedSessionMode: "canonical_review"` stage approval-queued canonical edits; `insights_only` profiles keep exploratory synthesis in `wiki/insights/`. Use `--review` only for the lighter review-only path.
6. Use `swarmvault inbox import` for capture-style batches, then `swarmvault watch --lint --repo` when the workflow should stay automated. Add `--code-only` when the refresh should stay AST-only and defer non-code semantic re-analysis to a later `compile`. On tracked repos, code-only changes take that faster compile path automatically. Run `swarmvault hook install` when git checkouts and commits should trigger the same repo-aware code-only refresh automatically.
7. Compile with `swarmvault compile`, use `compile --max-tokens <n>` when the generated wiki must stay inside a bounded context budget, or use `compile --approve` when changes should go through the local review queue first.
8. Resolve staged work with `swarmvault review list|show|accept|reject` and `swarmvault candidate list|promote|archive`.
9. Ask questions with `swarmvault query "<question>"`. It saves durable answers into `wiki/outputs/` by default; add `--no-save` only for ephemeral checks. When an embedding provider is configured, query can merge semantic page matches into local search; `search.rerank: true` lets the current `queryProvider` rerank the merged top hits before answering.
10. Use `swarmvault explore "<question>" --steps <n>` for save-first multi-step research loops, or `--format report|slides|chart|image` when the artifact should be presentation-oriented.
11. Run `swarmvault lint` whenever the schema changed, artifacts look stale, or compile/query results drift. Set `profile.deepLintDefault: true` in `swarmvault.config.json` when the advisory deep-lint pass should be the default, and use `--no-deep` when you need a structural-only run. Add `--web` only when deep lint is enabled and a `webSearch.tasks.deepLintProvider` adapter is configured; web evidence is scoped to deep lint and does not change compile or query behavior.
12. Use `swarmvault mcp` when another agent or tool should browse, search, and query the vault through MCP. When this plugin is installed, the SwarmVault MCP server is already registered — the agent can call its tools directly.
13. Use `swarmvault graph blast <target>` when the user wants reverse-import impact analysis, `swarmvault graph serve` when the live workspace or bookmarklet clipper will help, `swarmvault diff` when they need a graph-level change summary against the last committed baseline, or `swarmvault graph export --html <output>` / `graph export --report <output>` when sharing will help. `graph export` also supports `--html-standalone`, `--json`, `--obsidian`, and `--canvas` for lighter or Obsidian-native sharing.

## Working rules

- Prefer changing the schema before re-running compile when organization or grounding is wrong.
- Treat `wiki/` and `state/` as first-class outputs. Inspect them instead of trusting a single chat answer.
- Prefer `wiki/graph/report.md`, `state/graph.json`, and saved wiki pages over ad hoc broad search when they already exist.
- Use `source add` for recurring files, directories, public GitHub repo roots, and docs hubs. Use `ingest` and `add` for deliberate one-off inputs.
- When the vault lives in a git repo, `ingest|compile|query --commit` can commit `wiki/` and `state/` changes immediately after the run.
- The default heuristic provider is a valid local/offline starting point. Add a model provider only when the user wants richer synthesis quality or optional capabilities such as embeddings, vision, image generation, or audio transcription. The recommended fully-local setup is Ollama + Gemma: `ollama pull gemma4` then set `providers.llm` to `{ type: "ollama", model: "gemma4" }` and point `tasks.compileProvider`, `tasks.queryProvider`, and `tasks.lintProvider` at it.
- Audio ingest needs `tasks.audioProvider` to resolve to a provider that exposes `audio` capability. YouTube transcript ingest does not need a provider. Set `graph.communityResolution` when the user wants to pin community clustering instead of using the adaptive default.
- If an OpenAI-compatible backend cannot satisfy structured generation, reduce its declared capabilities instead of forcing every task through it.
- Keep raw sources immutable. Put corrections in schema, new sources, or saved outputs rather than manually rewriting generated provenance.
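As a concrete illustration of the fully-local Ollama setup described in the working rules above, the provider section of `swarmvault.config.json` might look like this. This is a sketch assembled from the fields named in this skill (`providers.llm`, `tasks.compileProvider`, `tasks.queryProvider`, `tasks.lintProvider`); the `"llm"` reference name is an assumption, and the exact key layout may differ from the shipped docs:

```json
{
  "providers": {
    "llm": { "type": "ollama", "model": "gemma4" }
  },
  "tasks": {
    "compileProvider": "llm",
    "queryProvider": "llm",
    "lintProvider": "llm"
  }
}
```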

## Files and artifacts

- `swarmvault.schema.md`: vault-specific compile and query rules.
- `raw/sources/` and `raw/assets/`: canonical source storage.
- `wiki/`: generated pages plus saved outputs.
- `wiki/outputs/source-briefs/`: saved onboarding briefs for managed sources.
- `wiki/outputs/source-sessions/`: resumable guided-session anchors plus question/answer history for one-source-at-a-time integration.
- `wiki/outputs/source-reviews/`: staged source-scoped review pages.
- `wiki/outputs/source-guides/`: staged source-integration guides for one-source-at-a-time workflows.
- `wiki/dashboards/`: recent sources, reading log, timeline, source sessions, source guides, research map, contradiction, and open-question dashboards.
- `state/extracts/`: extracted markdown and JSON sidecars for ingested sources (PDF, Office formats, transcripts, code, config, data files, and more).
- `state/code-index.json`: repo-aware code aliases and local import resolution data.
- `wiki/projects/`: project rollups over canonical pages.
- `wiki/candidates/`: staged concept and entity pages awaiting promotion.
- `state/graph.json`: compiled graph.
- `state/search.sqlite`: local search index.
- `state/sources.json` and `state/sources/<id>/`: managed-source registry entries plus working sync state.
- `state/approvals/`: staged review bundles from `compile --approve`.
- `state/sessions/`: canonical session artifacts for compile, query, explore, lint, watch, review, and candidate actions.
- `state/jobs.ndjson`: watch-mode run log.

## Agent integration

- `swarmvault install --agent claude` installs graph-first rules into the current project for Claude Code.
- `swarmvault mcp` exposes tools and resources for page search, page reads, source listing, query, ingest, compile, and lint. This plugin registers it automatically.

## Defaults to preserve

- Keep raw source material immutable under `raw/`.
- Save useful answers unless the user explicitly wants ephemeral output.
- Prefer reviewable flows such as `compile --approve`, `review`, and `candidate` when a change should not activate silently.
- Treat provider setup as part of serious vault operation. If only `heuristic` is configured, say so clearly.
- When a vault uses the `profile` block in `swarmvault.config.json`, respect it as the deterministic behavior layer. `swarmvault.schema.md` still defines the human intent layer.
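For example, a vault that wants guided ingest and the advisory deep-lint pass on by default might carry a `profile` block like the sketch below, using only the profile keys named earlier in this skill (`guidedIngestDefault`, `deepLintDefault`, `guidedSessionMode`); treat it as illustrative rather than a canonical schema:

```json
{
  "profile": {
    "guidedIngestDefault": true,
    "deepLintDefault": true,
    "guidedSessionMode": "canonical_review"
  }
}
```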