Commit ff0f88a

**Author:** Nicholas Cecere (committed)

**v0.3.12: Add AWS cross-account parameters and comprehensive documentation update**

- Add `aws_session_name` and `aws_role_name` parameters to model resource
- Comprehensive documentation overhaul with consolidated examples
- Update vector store documentation to reflect only officially supported LiteLLM providers
- Clean up project structure by removing scattered example files
- Update provider source references and enhance all resource documentation

1 parent b5bbdb4 · commit ff0f88a

19 files changed: +639 −822 lines changed

**CHANGELOG.md** — 28 additions, 0 deletions

````diff
@@ -7,6 +7,34 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## [Unreleased]
 
+## [0.3.12] - 2025-08-13
+
+### Added
+- **New AWS Parameters**: Added `aws_session_name` and `aws_role_name` to model resource for cross-account access scenarios
+  - Support for AWS session names in cross-account access configurations
+  - Support for AWS IAM role names for cross-account access
+  - Enhanced AWS Bedrock integration capabilities
+
+### Changed
+- **Documentation Overhaul**: Comprehensive update to all provider documentation
+  - Updated provider source references from `bitop/litellm` to `registry.terraform.io/ncecere/litellm`
+  - Consolidated all scattered example files into organized documentation structure
+  - Enhanced all resource documentation with multiple real-world examples
+  - Added comprehensive cross-resource integration examples
+- **Vector Store Documentation**: Updated to reflect only officially supported LiteLLM providers
+  - Removed unsupported providers (Pinecone, Weaviate, Chroma, Qdrant, Milvus, FAISS)
+  - Added accurate examples for supported providers: AWS Bedrock Knowledge Bases, OpenAI Vector Stores, Azure Vector Stores, Vertex AI RAG Engine, PG Vector
+  - Updated provider-specific parameters with correct configurations
+  - Added references to official LiteLLM documentation
+- **Project Organization**: Cleaned up project structure
+  - Removed scattered example files from root directory
+  - Consolidated all examples into comprehensive documentation
+  - Updated README.md to reflect current capabilities and structure
+
+### Fixed
+- Corrected vector store provider documentation to match LiteLLM's official capabilities
+- Updated all documentation links and references for accuracy
+
 ## [0.3.11] - 2025-08-10
 
 ### Added
````
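The new cross-account parameters in this changelog might be wired into a model resource roughly like this (a sketch: `aws_session_name` and `aws_role_name` come from this commit, while the surrounding attribute values, the role name, and the Bedrock model identifier are illustrative placeholders, not taken from the repository):

```hcl
# Illustrative sketch only: aws_session_name / aws_role_name are the
# parameters added in v0.3.12; all values here are placeholders.
resource "litellm_model" "bedrock_cross_account" {
  model_name          = "claude-bedrock"
  custom_llm_provider = "bedrock"
  base_model          = "anthropic.claude-3-sonnet-20240229-v1:0"
  mode                = "chat"
  tier                = "paid"

  # New in v0.3.12: assume an IAM role in another AWS account
  aws_role_name    = "litellm-bedrock-access"
  aws_session_name = "litellm-proxy-session"
}
```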

**README.md** — 4 additions, 2 deletions

````diff
@@ -130,6 +130,7 @@ For full details on the `litellm_key` resource, see the [key resource
 - `litellm_model`: Manage model configurations. [Documentation](docs/resources/model.md)
 - `litellm_team`: Manage teams. [Documentation](docs/resources/team.md)
 - `litellm_team_member`: Manage team members. [Documentation](docs/resources/team_member.md)
+- `litellm_team_member_add`: Add multiple members to teams. [Documentation](docs/resources/team_member_add.md)
 - `litellm_key`: Manage API keys. [Documentation](docs/resources/key.md)
 - `litellm_mcp_server`: Manage MCP (Model Context Protocol) servers. [Documentation](docs/resources/mcp_server.md)
 - `litellm_credential`: Manage credentials for secure authentication. [Documentation](docs/resources/credential.md)
@@ -212,6 +213,7 @@ This project is licensed under the Apache License 2.0 - see the [LICENSE](LICENS
 ## Notes
 
 - Always use environment variables or secure secret management solutions to handle sensitive information like API keys and AWS credentials.
-- Refer to the `examples/` directory for more detailed usage examples.
+- Refer to the comprehensive documentation in the `docs/` directory for detailed usage examples and configuration options.
 - Make sure to keep your provider version updated for the latest features and bug fixes.
-- v0.2.3 introduces support for the `reasoning_effort` attribute in the `litellm_model` resource. This attribute accepts "low", "medium", or "high" to control the model's reasoning effort.
+- The provider now supports AWS cross-account access with `aws_session_name` and `aws_role_name` parameters in the model resource.
+- All example configurations have been consolidated into the documentation for better organization and maintenance.
````
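Per the README's note on secret handling, the `var.*` references used throughout the examples would typically be backed by sensitive Terraform variables; a minimal sketch (the variable name mirrors the docs, the URL is a placeholder):

```hcl
# Declare secrets as sensitive variables and supply them via
# TF_VAR_* environment variables or a secret manager, never in code.
variable "litellm_api_key" {
  type      = string
  sensitive = true
}

provider "litellm" {
  api_base = "https://your-litellm-proxy.com" # placeholder URL
  api_key  = var.litellm_api_key
}
```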

**docs/index.md** — 62 additions, 11 deletions

````diff
@@ -1,42 +1,64 @@
 # LiteLLM Provider
 
-The LiteLLM provider allows Terraform to manage LiteLLM resources. LiteLLM is a proxy service that standardizes the input/output across different LLM APIs.
+The LiteLLM provider allows Terraform to manage LiteLLM resources. LiteLLM is a proxy service that standardizes the input/output across different LLM APIs, providing a unified interface for various language model providers.
 
 ## Example Usage
 
 ```hcl
 terraform {
   required_providers {
     litellm = {
-      source  = "bitop/litellm"
-      version = "~> 0.2.3"
+      source = "registry.terraform.io/ncecere/litellm"
     }
   }
 }
 
 provider "litellm" {
-  api_base = "http://your-litellm-instance:4000"
-  api_key  = "your-api-key"
+  api_base = "https://your-litellm-proxy.com"
+  api_key  = var.litellm_api_key
 }
 
-# Example Model Configuration
-
-```hcl
-resource "litellm_model" "example" {
+# Basic model configuration
+resource "litellm_model" "gpt4" {
   model_name          = "gpt-4-proxy"
   custom_llm_provider = "openai"
   model_api_key       = var.openai_api_key
-  model_api_base      = "https://api.openai.com/v1"
   base_model          = "gpt-4"
   tier                = "paid"
   mode                = "chat"
-  reasoning_effort    = "medium" # Optional: "low", "medium", or "high"
 
   input_cost_per_million_tokens  = 30.0
   output_cost_per_million_tokens = 60.0
 }
+
+# Team configuration
+resource "litellm_team" "dev_team" {
+  team_alias = "development-team"
+  models     = [litellm_model.gpt4.model_name]
+  max_budget = 100.0
+}
 ```
 
+## Available Resources
+
+The LiteLLM provider supports the following resources:
+
+* [`litellm_model`](./resources/model) - Manage LiteLLM model configurations
+* [`litellm_team`](./resources/team) - Manage teams and their permissions
+* [`litellm_team_member`](./resources/team_member) - Manage team member configurations
+* [`litellm_team_member_add`](./resources/team_member_add) - Add members to teams
+* [`litellm_key`](./resources/key) - Manage API keys
+* [`litellm_mcp_server`](./resources/mcp_server) - Manage MCP (Model Context Protocol) servers
+* [`litellm_credential`](./resources/credential) - Manage credentials for various providers
+* [`litellm_vector_store`](./resources/vector_store) - Manage vector stores
+
+## Available Data Sources
+
+The LiteLLM provider supports the following data sources:
+
+* [`litellm_credential`](./data-sources/credential) - Retrieve credential information
+* [`litellm_vector_store`](./data-sources/vector_store) - Retrieve vector store information
+
 ## Authentication
 
 The LiteLLM provider requires an API key and base URL for authentication. These can be provided in the provider configuration block or via environment variables.
@@ -46,9 +68,38 @@ The LiteLLM provider requires an API key and base URL for authentication. These
 - `LITELLM_API_BASE` - The base URL of your LiteLLM instance
 - `LITELLM_API_KEY` - Your LiteLLM API key
 
+### Example with Environment Variables
+
+```bash
+export LITELLM_API_BASE="https://your-litellm-proxy.com"
+export LITELLM_API_KEY="your-api-key"
+```
+
+```hcl
+terraform {
+  required_providers {
+    litellm = {
+      source = "registry.terraform.io/ncecere/litellm"
+    }
+  }
+}
+
+# Provider will automatically use environment variables
+provider "litellm" {}
+```
+
 ## Provider Arguments
 
 The following arguments are supported in the provider block:
 
 * `api_base` - (Required) The base URL of your LiteLLM instance. This can also be provided via the `LITELLM_API_BASE` environment variable.
 * `api_key` - (Required) The API key used to authenticate with LiteLLM. This can also be provided via the `LITELLM_API_KEY` environment variable.
+
+## Getting Started
+
+1. Install the provider by adding it to your Terraform configuration
+2. Configure your LiteLLM instance URL and API key
+3. Start creating resources like models, teams, and credentials
+4. Use data sources to reference existing configurations
+
+For detailed examples and configuration options, see the individual resource and data source documentation pages.
````
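The "Getting Started" steps mention using data sources to reference existing configurations; a hedged sketch of what that could look like, assuming the data sources accept the same name arguments as their resource counterparts (the argument names are not confirmed by this commit):

```hcl
# Hypothetical lookups; `credential_name` / `vector_store_name` are
# assumed to mirror the corresponding resource arguments.
data "litellm_credential" "existing" {
  credential_name = "openai-api-key"
}

data "litellm_vector_store" "docs" {
  vector_store_name = "my-vector-store"
}
```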

**docs/resources/credential.md** — 69 additions, 7 deletions

````diff
@@ -12,6 +12,8 @@ Manages a LiteLLM credential for storing sensitive authentication information. C
 
 ## Example Usage
 
+### Basic OpenAI Credential
+
 ```terraform
 resource "litellm_credential" "openai_cred" {
   credential_name = "openai-api-key"
@@ -20,32 +22,55 @@ resource "litellm_credential" "openai_cred" {
   credential_info = {
     provider = "openai"
     region   = "us-east-1"
+    purpose  = "chat-completions"
+  }
+
+  credential_values = {
+    api_key = var.openai_api_key
+    org_id  = var.openai_org_id
+  }
+}
+```
+
+### Anthropic Credential
+
+```terraform
+resource "litellm_credential" "anthropic_cred" {
+  credential_name = "anthropic-api-key"
+
+  credential_info = {
+    provider = "anthropic"
+    purpose  = "text-generation"
   }
 
   credential_values = {
-    api_key = "sk-..."
-    org_id  = "org-..."
+    api_key = var.anthropic_api_key
   }
 }
 ```
 
-## Example Usage with Vector Store
+### Pinecone Vector Store Credential
 
 ```terraform
 resource "litellm_credential" "pinecone_cred" {
-  credential_name = "pinecone-api-key"
+  credential_name = "pinecone-production"
 
   credential_info = {
-    provider = "pinecone"
+    provider    = "pinecone"
     environment = "production"
+    region      = "us-east-1"
   }
 
   credential_values = {
-    api_key = "your-pinecone-api-key"
-    index_name = "your-index-name"
+    api_key    = var.pinecone_api_key
+    index_name = "document-embeddings"
   }
 }
+```
+
+### Using Credentials with Vector Store
 
+```terraform
 resource "litellm_vector_store" "example" {
   vector_store_name   = "my-vector-store"
   custom_llm_provider = "pinecone"
@@ -60,6 +85,43 @@ resource "litellm_vector_store" "example" {
 }
 ```
 
+### Multiple Provider Credentials
+
+```terraform
+# AWS Bedrock credential
+resource "litellm_credential" "aws_bedrock" {
+  credential_name = "aws-bedrock-cred"
+
+  credential_info = {
+    provider = "aws"
+    service  = "bedrock"
+    region   = "us-east-1"
+  }
+
+  credential_values = {
+    aws_access_key_id     = var.aws_access_key_id
+    aws_secret_access_key = var.aws_secret_access_key
+    aws_region            = "us-east-1"
+  }
+}
+
+# Azure OpenAI credential
+resource "litellm_credential" "azure_openai" {
+  credential_name = "azure-openai-cred"
+
+  credential_info = {
+    provider = "azure"
+    service  = "openai"
+  }
+
+  credential_values = {
+    api_key     = var.azure_openai_key
+    api_base    = var.azure_openai_endpoint
+    api_version = "2023-12-01-preview"
+  }
+}
+```
+
 ## Argument Reference
 
 The following arguments are supported:
````
