
Conversation

@dumko2001
Contributor

This PR modifies ChatOllama to explicitly filter out the 'strict' argument before passing parameters to the underlying Ollama client. This prevents a TypeError when structured-output helpers (e.g. a ProviderStrategy) inject this argument.

Fixes #34107

Breaking Changes:

  • None. This is a bug fix that allows previously crashing code to run successfully.

AI Disclaimer:
I used an AI assistant to generate the regression test case for this fix.

@github-actions github-actions bot added integration Related to a provider partner package integration ollama fix labels Nov 27, 2025
@codspeed-hq

codspeed-hq bot commented Nov 27, 2025

CodSpeed Performance Report

Merging #34114 will not alter performance

Comparing dumko2001:issue-34107 (b4bb11d) with master (0a6d01e)

Summary

✅ 1 untouched
⏩ 33 skipped¹

Footnotes

  1. 33 benchmarks were skipped, so the baseline results were used instead. If they were deleted from the codebase, they can be archived to remove them from the performance reports.

@mdrxy mdrxy changed the title fix(ollama): Filter unsupported 'strict' argument in ChatOllama #34107 fix(ollama): Filter unsupported 'strict' argument in ChatOllama Nov 27, 2025
@mdrxy mdrxy changed the title fix(ollama): Filter unsupported 'strict' argument in ChatOllama fix(ollama): pop unsupported 'strict' argument in ChatOllama Nov 27, 2025
@github-actions github-actions bot added fix and removed fix labels Nov 27, 2025
@ccurme ccurme self-assigned this Dec 5, 2025


Successfully merging this pull request may close these issues:

Structured output with OpenAI model from Ollama causes TypeError in langchain 1.1.0
