Description
When routing across multiple providers using the LiteLLM router, a BadRequestError is thrown if a provider returns a tool call ID that doesn't comply with OpenAI's strict format requirement ([a-zA-Z0-9]{9}, exactly 9 alphanumeric characters).
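The strict format can be expressed as a regular expression. A minimal validity check is sketched below; the name is_valid_tool_call_id is illustrative, not an actual LiteLLM helper:

```python
import re

# Strict tool call ID format reported in the error: exactly 9
# characters drawn from a-z, A-Z, 0-9.
TOOL_CALL_ID_RE = re.compile(r"[a-zA-Z0-9]{9}")

def is_valid_tool_call_id(tool_call_id: str) -> bool:
    # fullmatch anchors the pattern to the entire string,
    # so longer or shorter IDs are rejected.
    return TOOL_CALL_ID_RE.fullmatch(tool_call_id) is not None
```

Under this check, an ID like call_function_jlv0n7uyomle_1 fails validation, while a 9-character alphanumeric ID passes.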
Root Cause
Some OpenAI-compatible providers (e.g. MiniMax via OpenRouter) return tool call IDs in non-standard formats such as call_function_jlv0n7uyomle_1 — using the function name and an index suffix. These IDs are stored verbatim in the conversation history.
On the next turn, when the router sends this history (including the assistant message with tool_calls[].id and the tool result message with tool_call_id) to a provider that strictly validates the format (e.g. Mistral), the request is rejected:
litellm.BadRequestError: OpenAIException - Tool call id was call_function_jlv0n7uyomle_1 but must be a-z, A-Z, 0-9, with a length of 9.
Because this is a BadRequestError (400), LiteLLM does not retry or fall back — the error surfaces directly to the caller.
Steps to Reproduce
- Configure a router with a fast-free model group containing a MiniMax model (e.g. openai/minimax/minimax-m2.5:free via OpenRouter)
- Make a multi-turn conversation with tool use
- On the second turn (when the tool result is in the message history), route the request to a Mistral model (e.g. mistral-small-2506)
- Observe the BadRequestError
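A router config along these lines reproduces the setup (the exact model identifiers and provider prefixes are illustrative; any non-compliant-ID provider paired with a strictly validating one in the same group will do):

```yaml
model_list:
  - model_name: fast-free
    litellm_params:
      model: openrouter/minimax/minimax-m2.5:free
  - model_name: fast-free
    litellm_params:
      model: mistral/mistral-small-2506
```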
Expected Behavior
LiteLLM should normalize non-compliant tool call IDs before sending them to a provider, rewriting both the assistant message tool_calls[].id and the corresponding tool result tool_call_id consistently so they remain paired.
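One way to implement this is a deterministic rewrite pass over the message history before dispatch. The sketch below is a minimal illustration, not LiteLLM's actual code: it hashes non-compliant IDs down to 9 alphanumeric characters, and because the mapping is deterministic, the assistant message's tool_calls[].id and the tool result's tool_call_id stay paired:

```python
import hashlib

def normalize_tool_call_id(tool_call_id: str) -> str:
    """Map an arbitrary provider tool call ID to 9 alphanumeric chars."""
    if len(tool_call_id) == 9 and tool_call_id.isalnum():
        return tool_call_id  # already compliant, pass through unchanged
    # SHA-256 hex digest is [0-9a-f], so any 9-char prefix is alphanumeric;
    # the same input always yields the same output, preserving pairing.
    return hashlib.sha256(tool_call_id.encode()).hexdigest()[:9]

def normalize_messages(messages: list[dict]) -> list[dict]:
    """Rewrite tool call IDs in both assistant and tool messages."""
    out = []
    for msg in messages:
        msg = dict(msg)  # shallow copy; don't mutate caller's history
        if msg.get("tool_calls"):
            msg["tool_calls"] = [
                {**tc, "id": normalize_tool_call_id(tc["id"])}
                for tc in msg["tool_calls"]
            ]
        if msg.get("tool_call_id"):
            msg["tool_call_id"] = normalize_tool_call_id(msg["tool_call_id"])
        out.append(msg)
    return out
```

A hash-based rewrite avoids keeping per-conversation state: any later turn that replays the same history produces the same normalized IDs.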
Comparison
A similar fix already exists for the Anthropic provider: _sanitize_anthropic_tool_use_id() in litellm/litellm_core_utils/prompt_templates/factory.py sanitizes tool use IDs when converting to Anthropic format. The same principle needs to be applied in the OpenAI/OpenAI-compatible transform_request path.
Fix
PR: #22318