Conversation

@gut-puncture

Fix: Preserve extra_content for Gemini Multi-Turn Tool Calls

Fixes #7519

Problem

When using Gemini 3 Pro via the OpenAI compatibility layer, multi-turn tool call conversations fail with:

400 Bad Request: function call is missing a `thought_signature`

Root Cause

Gemini models attach a thought_signature (under extra_content.google.thought_signature) to tool call responses. That signature must be echoed back in subsequent requests for the model to maintain context, but the CLI was discarding the field.
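
For orientation, the payload in question looks roughly like this (a sketch using serde_json; the signature value is invented, and only the extra_content.google.thought_signature nesting comes from the issue report):

use serde_json::json;

fn main() {
    // Rough shape of the provider-specific data Gemini attaches to a tool call.
    // The real signature is an opaque token that must be echoed back unchanged
    // on the matching tool call in the next request.
    let extra_content = json!({
        "google": {
            "thought_signature": "opaque-signature-token"
        }
    });
    println!("{extra_content}");
}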

Solution

Added support for preserving and echoing extra_content in the Chat Completions API path:

1. protocol/src/models.rs

Added extra_content field to ResponseItem::FunctionCall:

FunctionCall {
    // ... existing fields ...
    #[serde(default, skip_serializing_if = "Option::is_none")]
    extra_content: Option<serde_json::Value>,
}

2. codex-api/src/sse/chat.rs

Capture extra_content from SSE tool call deltas:

struct ToolCallState {
    name: Option<String>,
    arguments: String,
    extra_content: Option<serde_json::Value>,  // NEW
}

// In processing loop:
if let Some(extra) = tool_call.get("extra_content") {
    call_state.extra_content = Some(extra.clone());
}
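
For context, here is a standalone sketch of how a streamed tool-call delta could be folded into that state; the apply_tool_call_delta helper and the /function/... delta paths are illustrative, not the repo's actual code:

use serde_json::Value;

#[derive(Default)]
struct ToolCallState {
    name: Option<String>,
    arguments: String,
    extra_content: Option<Value>,
}

// Fold one streamed tool_call delta into the accumulated state.
fn apply_tool_call_delta(state: &mut ToolCallState, tool_call: &Value) {
    if let Some(name) = tool_call.pointer("/function/name").and_then(Value::as_str) {
        state.name = Some(name.to_string());
    }
    if let Some(args) = tool_call.pointer("/function/arguments").and_then(Value::as_str) {
        state.arguments.push_str(args);
    }
    // Keep the provider-specific payload (e.g. Gemini's thought_signature) verbatim.
    if let Some(extra) = tool_call.get("extra_content") {
        state.extra_content = Some(extra.clone());
    }
}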

3. codex-api/src/requests/chat.rs

Include extra_content in outgoing request JSON:

if let Some(extra) = extra_content {
    tool_call_obj["extra_content"] = extra.clone();
}
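
Put together, the outgoing tool call entry is built roughly like this (build_tool_call_json is a hypothetical helper; the id/type/function fields follow the standard Chat Completions shape, and only the conditional echo at the end mirrors the actual change):

use serde_json::{json, Value};

// Build one entry of the outgoing assistant `tool_calls` array, echoing any
// provider-specific extra_content captured from the earlier response.
fn build_tool_call_json(
    call_id: &str,
    name: &str,
    arguments: &str,
    extra_content: Option<&Value>,
) -> Value {
    let mut tool_call_obj = json!({
        "id": call_id,
        "type": "function",
        "function": { "name": name, "arguments": arguments },
    });
    if let Some(extra) = extra_content {
        // Echoed verbatim so Gemini can recover its thought_signature.
        tool_call_obj["extra_content"] = extra.clone();
    }
    tool_call_obj
}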

Backward Compatibility

  • #[serde(default)] - Existing data without extra_content deserializes correctly
  • skip_serializing_if = "Option::is_none" - Output stays clean for non-Gemini providers
  • Rust .. pattern matching - 20+ existing code paths automatically ignore the new field
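
As a quick roundtrip illustration of those attributes (using a standalone stand-in struct rather than the real ResponseItem):

use serde::{Deserialize, Serialize};

// Stand-in type, just to demonstrate the serde behavior described above.
#[derive(Serialize, Deserialize)]
struct FunctionCall {
    name: String,
    arguments: String,
    #[serde(default, skip_serializing_if = "Option::is_none")]
    extra_content: Option<serde_json::Value>,
}

fn main() {
    // Old data without extra_content still deserializes; the field defaults to None.
    let old: FunctionCall =
        serde_json::from_str(r#"{"name":"get_weather","arguments":"{}"}"#).unwrap();
    assert!(old.extra_content.is_none());

    // None is skipped on serialization, so output for non-Gemini providers is unchanged.
    let serialized = serde_json::to_string(&old).unwrap();
    assert!(!serialized.contains("extra_content"));
}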

Testing

cargo test --package codex-protocol
cargo test --package codex-api
cargo test --package codex-core

Related

… tool calls

Gemini models include a thought_signature in extra_content of tool call
responses that must be echoed back in subsequent requests. Without this,
multi-turn conversations with tool calls fail with 400 Bad Request.

Changes:
- Add extra_content field to ResponseItem::FunctionCall in models.rs
- Capture extra_content from SSE tool_calls delta in sse/chat.rs
- Include extra_content in outgoing request JSON in requests/chat.rs

The extra_content field uses #[serde(default)] for backward compatibility
and skip_serializing_if for clean output with non-Gemini providers.
@github-actions

github-actions bot commented Dec 4, 2025

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.

@gut-puncture
Author

I have read the CLA Document and I hereby sign the CLA

github-actions bot added a commit that referenced this pull request Dec 4, 2025
@gut-puncture
Author

recheck

@chatgpt-codex-connector bot (Contributor) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

gut-puncture and others added 2 commits December 4, 2025 19:30
Update all call sites that construct ResponseItem::FunctionCall to
include the new extra_content field, restoring buildability.
@etraut-openai added the chat-endpoint label ("Bugs or PRs related to the chat/completions endpoint (wire API)") on Dec 4, 2025
@gut-puncture
Copy link
Author

@etraut-openai sir, can we go ahead with the review of this PR or is there any issue here?

@etraut-openai
Copy link
Collaborator

@gut-puncture, the team hasn't had time to review this PR yet. There are others ahead of it in the queue.

Development

Successfully merging this pull request may close these issues.

Gemini 3 Pro: Multi-turn tool calls fail because thought_signature is not preserved via OpenAI compatibility layer
