Conversation

@andrewm4894 (Contributor)

Problem

When using OpenAI stored prompts, the model is defined in the OpenAI dashboard rather than passed in the API request. The PostHog AI wrapper only extracts the model from request parameters (kwargs.get("model")), causing generations to show up without a model and preventing cost calculations.

Fixes PostHog/posthog#42861

Changes

  • Updated utils.py to fall back to response.model when the model is not in kwargs (see the sketch below)
  • Updated the streaming handlers in openai.py and openai_async.py to extract the model from response chunks
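
For illustration, a minimal sketch of the fallback idea, assuming a hypothetical helper name (`extract_model` is not the actual function in utils.py):

```python
from typing import Any, Optional


def extract_model(kwargs: dict[str, Any], response: Any) -> Optional[str]:
    """Hypothetical helper showing the fallback; not the real utils.py API."""
    # Prefer the model passed explicitly in the request kwargs.
    model = kwargs.get("model")
    if model:
        return model
    # Stored prompts define the model in the OpenAI dashboard, so it is only
    # reported on the response object; fall back to that when available.
    return getattr(response, "model", None)
```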

When using OpenAI stored prompts, the model is defined in the OpenAI
dashboard rather than passed in the API request. This change adds a
fallback to extract the model from the response object when not
provided in kwargs.

Fixes PostHog/posthog#42861

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@greptile-apps greptile-apps bot (Contributor) left a comment

3 files reviewed, 2 comments

andrewm4894 and others added 3 commits December 19, 2025 16:05
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
- Add 8 tests covering model extraction from response for stored prompts
- Fix utils.py to add 'unknown' fallback for consistency
- Bump version to 7.4.1
- Update CHANGELOG.md

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@andrewm4894 andrewm4894 changed the title fix: extract model from response for OpenAI stored prompts fix(llma): extract model from response for OpenAI stored prompts Dec 19, 2025
@andrewm4894 andrewm4894 requested a review from a team December 19, 2025 16:17
"$ai_model": kwargs.get("model"),
"$ai_model": kwargs.get("model")
or getattr(response, "model", None)
or "unknown",
Member

Not sure how I feel about putting "unknown" if we don't know the model. This would trigger a cost model match that won't be correct. I'd rather not put anything in the field.

Contributor Author

hmm yeah, was wondering about that and thinking the same - best to leave it as None

but up above I see

model=kwargs.get("model", "unknown")

so I was trying to match that. but maybe I'm missing something and there's a reason it's not like that in utils.py, hmm

Member

Check the plugin server if we do anything with "unknown". Otherwise, it's just going to match it to the wrong cost model.

Contributor Author

seems like it's only "unknown" in the streaming path, so I think I should actually just match what was there before - will rejig a little to be more minimal

@Radu-Raicea Radu-Raicea (Member) left a comment

One remark about "unknown" that you should probably fix.

andrewm4894 and others added 2 commits December 19, 2025 17:33
…ehavior

Non-streaming originally returned None when the model wasn't in kwargs.
Streaming keeps the "unknown" fallback, as that was the original behavior
(sketched below).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
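
Roughly, the resulting behavior splits like this (a sketch with illustrative function names, not the actual code in utils.py or the streaming handlers):

```python
from typing import Any, Optional


def non_streaming_model(kwargs: dict[str, Any], response: Any) -> Optional[str]:
    # Original behavior preserved: return None when neither the kwargs nor the
    # response carry a model, so no incorrect cost model gets matched downstream.
    return kwargs.get("model") or getattr(response, "model", None)


def streaming_model(kwargs: dict[str, Any], chunk: Any) -> str:
    # The streaming handlers keep "unknown" as the last-resort fallback, since
    # that was their behavior before this change.
    return kwargs.get("model") or getattr(chunk, "model", None) or "unknown"
```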
Verifies that non-streaming returns None (not "unknown") when the model
is not available in kwargs or the response, matching the original behavior
(see the test sketch below).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
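
A rough pytest-style sketch of such a check (test name and setup are hypothetical, not the actual test added in this PR):

```python
from types import SimpleNamespace


def test_non_streaming_model_is_none_when_unavailable():
    # Neither the request kwargs nor the response expose a model.
    kwargs: dict = {}
    response = SimpleNamespace()  # no "model" attribute

    model = kwargs.get("model") or getattr(response, "model", None)

    # Matches the original behavior: None rather than "unknown".
    assert model is None
```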
@andrewm4894 andrewm4894 merged commit 14d1d0b into master Dec 21, 2025
15 checks passed
@andrewm4894 andrewm4894 deleted the feat/llma-openai-stored-prompts branch December 21, 2025 21:17

Development

Successfully merging this pull request may close these issues.

bug(llma): AI wrappers don't extract model from response when using OpenAI stored prompts
