An OpenCode plugin that logs raw LLM request bodies to disk for debugging and analysis.
Intercepts outgoing fetch calls to LLM providers (Anthropic, OpenAI, Google, Bedrock, etc.) and writes each request body as a timestamped JSON file. Logs are organized by session ID.
Log location: `~/.config/opencode/logs/request-logger/<sessionId>/<timestamp>_<seq>.json`
Each file contains:
```json
{
  "timestamp": "2026-03-04T12:00:00.000Z",
  "url": "https://api.anthropic.com/v1/messages",
  "body": { "...raw request body..." }
}
```

Two interception layers ensure coverage across all providers:
- `globalThis.fetch` wrapper -- catches most providers, since the AI SDK routes requests through `fetch`
- `chat.params` hook -- wraps the per-request `fetch` passed to `streamText`, catching providers that bypass `globalThis.fetch` (e.g. AWS Bedrock)
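The first layer can be sketched roughly as below. This is an illustrative sketch, not the plugin's actual source: `installFetchLogger` and `logRequest` are hypothetical names standing in for the plugin's internals.

```typescript
type LogFn = (url: string, body: string) => void;

// Illustrative sketch of the globalThis.fetch wrapper layer: replace the
// global fetch with a function that inspects the outgoing request body,
// then delegates to the original fetch unchanged.
// `logRequest` is a hypothetical stand-in for the plugin's file-writing logic.
function installFetchLogger(logRequest: LogFn): void {
  const originalFetch = globalThis.fetch;
  globalThis.fetch = async (input, init) => {
    // Only plain string bodies are inspected here; streams, FormData, etc.
    // pass through untouched.
    if (init && typeof init.body === "string") {
      const url =
        typeof input === "string"
          ? input
          : input instanceof URL
            ? input.toString()
            : input.url;
      logRequest(url, init.body);
    }
    return originalFetch(input, init);
  };
}
```

Because the AI SDK issues its HTTP calls through the global `fetch`, a wrapper like this sees most provider traffic; the `chat.params` hook covers the providers that construct their own client.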
Only requests with an AI-shaped body (containing `messages`, `input`, or `contents` arrays) are logged.
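That "AI-shaped body" check can be sketched as follows; `isAiRequestBody` is a hypothetical name, and the plugin's actual predicate may differ in detail.

```typescript
// Heuristic sketch: a request body looks like an LLM call if it carries one
// of the message-array fields used by the major provider APIs --
// `messages` (Anthropic / OpenAI chat), `input` (OpenAI Responses API),
// or `contents` (Google Gemini).
function isAiRequestBody(body: unknown): boolean {
  if (typeof body !== "object" || body === null) return false;
  const b = body as Record<string, unknown>;
  return ["messages", "input", "contents"].some((key) => Array.isArray(b[key]));
}
```

Keeping the filter this narrow means ordinary traffic (token counting, model listing, telemetry) never reaches the log directory.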
```sh
# From the plugin directory
npm install
npm run build
```

Then add to your OpenCode config (`~/.config/opencode/config.json`):
```json
{
  "plugin": {
    "opencode-request-logger": {
      "path": "~/.config/opencode/plugin/opencode-request-logger"
    }
  }
}
```