diff --git a/docs/self-hosting/govern/environment-variables.md b/docs/self-hosting/govern/environment-variables.md
index 05229b8..3d16ec7 100644
--- a/docs/self-hosting/govern/environment-variables.md
+++ b/docs/self-hosting/govern/environment-variables.md
@@ -190,18 +190,19 @@ To start Plane AI services, set each replica count to `1`:
 
 Plane AI supports multiple LLM providers. Configure one or more by adding their API keys.
 
-| Variable                   | Description                                                     | Required |
-| -------------------------- | --------------------------------------------------------------- | -------- |
-| **OPENAI_API_KEY**         | API key for OpenAI models                                       | Optional |
-| **CLAUDE_API_KEY**         | API key for Anthropic models                                    | Optional |
-| **GROQ_API_KEY**           | API key for speech-to-text features                             | Optional |
-| **CUSTOM_LLM_ENABLED**     | Set to `true` to use a custom LLM with an OpenAI-compatible API | Optional |
-| **CUSTOM_LLM_MODEL_KEY**   | Identifier key for the custom model                             | Optional |
-| **CUSTOM_LLM_BASE_URL**    | Base URL of the custom model's OpenAI-compatible endpoint       | Optional |
-| **CUSTOM_LLM_API_KEY**     | API key for the custom endpoint                                 | Optional |
-| **CUSTOM_LLM_NAME**        | Display name for the custom model                               | Optional |
-| **CUSTOM_LLM_DESCRIPTION** | Description of the custom model                                 | Optional |
-| **CUSTOM_LLM_MAX_TOKENS**  | Maximum token limit for the custom model                        | Optional |
+| Variable                  | Description | Required |
+| ------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------ | -------- |
+| **OPENAI_API_KEY**        | API key for OpenAI models | Optional |
+| **CLAUDE_API_KEY**        | API key for Anthropic models | Optional |
+| **GROQ_API_KEY**          | API key for speech-to-text features | Optional |
+| **CUSTOM_LLM_ENABLED**    | Set to `true` to enable a custom LLM. Supports OpenAI-compatible endpoints and AWS Bedrock. | Optional |
+| **CUSTOM_LLM_PROVIDER**   | Backend provider for the custom model. Accepted values: `openai` (default), `bedrock`. | Optional |
+| **CUSTOM_LLM_MODEL_KEY**  | Identifier key for the custom model (e.g. a model ID or name). | Optional |
+| **CUSTOM_LLM_BASE_URL**   | Base URL of the custom model's OpenAI-compatible endpoint. Required when `CUSTOM_LLM_PROVIDER=openai`. | Optional |
+| **CUSTOM_LLM_API_KEY**    | API key for authenticating with the custom endpoint. Required for the `openai` provider; used as the AWS access key ID when `CUSTOM_LLM_PROVIDER=bedrock`. | Optional |
+| **CUSTOM_LLM_AWS_REGION** | AWS region for the Bedrock model (e.g. `us-east-1`). Required when `CUSTOM_LLM_PROVIDER=bedrock`. | Optional |
+| **CUSTOM_LLM_NAME**       | Display name for the custom model shown in the UI. Defaults to `Custom LLM`. | Optional |
+| **CUSTOM_LLM_MAX_TOKENS** | Maximum token limit for the custom model. Defaults to `128000`. | Optional |
 
 #### Provider base URLs
 
diff --git a/docs/self-hosting/govern/plane-ai.md b/docs/self-hosting/govern/plane-ai.md
index ca96c4b..d12ad62 100644
--- a/docs/self-hosting/govern/plane-ai.md
+++ b/docs/self-hosting/govern/plane-ai.md
@@ -42,10 +42,15 @@ You can provide API keys for both OpenAI and Anthropic, making all models availa
 
 #### Custom models (self-hosted or third-party)
 
-Plane AI works with any model exposed through an OpenAI-compatible API, including models served by Ollama, Groq, Cerebras, and similar runtimes. You can configure one custom model alongside your public provider keys.
+Plane AI supports custom models through two backends:
 
-:::warning
-For reliable performance across all Plane AI features, use a custom model with at least 100 billion parameters. Larger models produce better results.
+- **OpenAI-compatible endpoint**: any model exposed via an OpenAI-compatible API, including models served by Ollama, Groq, Cerebras, and similar runtimes.
+- **AWS Bedrock**: models accessed directly through Amazon Bedrock using your AWS credentials.
+
+You can configure one custom model alongside your public provider keys.
+
+:::warning
+For all Plane AI features to work reliably, use a custom model with at least 100 billion parameters. Larger, more capable models yield better results.
 :::
 
 ### Embedding models
@@ -109,20 +114,22 @@ CLAUDE_API_KEY=xxxxxxxxxxxxxxxx
 
 ### Custom model
 
-Use this for self-hosted models or third-party OpenAI-compatible endpoints.
-
 ```bash
 CUSTOM_LLM_ENABLED=true
+CUSTOM_LLM_PROVIDER=openai # or 'bedrock'
 CUSTOM_LLM_MODEL_KEY=your-model-key
-CUSTOM_LLM_BASE_URL=http://your-endpoint/v1
 CUSTOM_LLM_API_KEY=your-api-key
 CUSTOM_LLM_NAME=Your Model Name
-CUSTOM_LLM_DESCRIPTION="Optional description"
 CUSTOM_LLM_MAX_TOKENS=128000
 ```
 
-:::info
-The custom endpoint must expose an OpenAI-compatible API matching OpenAI's request and response format.
+**Additional required variables by provider:**
+
+- **OpenAI-compatible** (`openai`): `CUSTOM_LLM_BASE_URL`
+- **AWS Bedrock** (`bedrock`): `CUSTOM_LLM_AWS_REGION`
+
+:::warning
+For Bedrock, the IAM user must have `bedrock:InvokeModel` permission on the target model.
 :::
 
 ### Speech-to-text (optional)
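
For reference, the two provider configurations this diff documents can be sketched as complete `.env` fragments. Only the `CUSTOM_LLM_*` variable names come from the docs; every value below is a hypothetical placeholder, not something the diff prescribes:

```shell
# Option A: OpenAI-compatible endpoint (illustrative values; a local Ollama
# server is assumed here, any OpenAI-compatible runtime works the same way)
CUSTOM_LLM_ENABLED=true
CUSTOM_LLM_PROVIDER=openai
CUSTOM_LLM_MODEL_KEY=your-model-key              # model ID as the endpoint expects it
CUSTOM_LLM_BASE_URL=http://localhost:11434/v1    # required for the openai provider
CUSTOM_LLM_API_KEY=your-api-key
CUSTOM_LLM_NAME="Your Model Name"
CUSTOM_LLM_MAX_TOKENS=128000

# Option B: AWS Bedrock (commented out; only one custom model can be active)
# CUSTOM_LLM_ENABLED=true
# CUSTOM_LLM_PROVIDER=bedrock
# CUSTOM_LLM_MODEL_KEY=your-bedrock-model-id     # a Bedrock model ID
# CUSTOM_LLM_API_KEY=your-aws-access-key-id      # per the table: used as the AWS access key ID
# CUSTOM_LLM_AWS_REGION=us-east-1                # required for the bedrock provider
```

Note how `CUSTOM_LLM_BASE_URL` appears only in the `openai` variant and `CUSTOM_LLM_AWS_REGION` only in the `bedrock` variant, matching the per-provider requirements stated in the table.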