
🐛max_tokens -> max_completion_tokens for newer OpenAI models#65

Open
MFA-X-AI wants to merge 1 commit into main from fahreza/openai-token-fix

Conversation


@MFA-X-AI MFA-X-AI commented Apr 1, 2026

Noticed that newer OpenAI models (gpt-5, o-series) reject max_tokens and require max_completion_tokens. This PR adds an OpenAiRequest struct that maps max_tokens -> max_completion_tokens at the HTTP boundary, matching how AnthropicProvider already uses a dedicated AnthropicRequest struct.
No change to the shared ChatCompletionRequest used by other providers.


wmeddie commented Apr 2, 2026

I need to encode the logic here properly so that it works with everything else. OpenAI is making life difficult with their special snowflake model rules.
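One way to encode that rule more generally is a predicate over model names, so the provider can pick the right parameter per model instead of hard-coding a single struct. This is a hedged sketch, not the repo's actual code; the prefix list is an assumption based on the "gpt-5, o-series" models named in the PR description and is illustrative, not exhaustive:

```go
package main

import (
	"fmt"
	"strings"
)

// usesCompletionTokens reports whether a model is assumed to require the
// newer max_completion_tokens parameter instead of max_tokens.
// Prefixes are illustrative guesses for gpt-5 and o-series models.
func usesCompletionTokens(model string) bool {
	for _, p := range []string{"gpt-5", "o1", "o3", "o4"} {
		if strings.HasPrefix(model, p) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(usesCompletionTokens("gpt-5-mini")) // newer model: true
	fmt.Println(usesCompletionTokens("gpt-4o"))     // older model: false
}
```

Keeping the rule in one predicate means other providers and older OpenAI models continue to send max_tokens untouched.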
