fix(llma): extract model from response for OpenAI stored prompts #395
Conversation
When using OpenAI stored prompts, the model is defined in the OpenAI dashboard rather than passed in the API request. This change adds a fallback to extract the model from the response object when not provided in kwargs.

Fixes PostHog/posthog#42861

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
3 files reviewed, 2 comments
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
- Add 8 tests covering model extraction from response for stored prompts
- Fix utils.py to add 'unknown' fallback for consistency
- Bump version to 7.4.1
- Update CHANGELOG.md

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
posthog/ai/utils.py
Outdated
```diff
-    "$ai_model": kwargs.get("model"),
+    "$ai_model": kwargs.get("model")
+    or getattr(response, "model", None)
+    or "unknown",
```
Not sure how I feel about putting "unknown" if we don't know the model. This would trigger a cost model match that won't be correct. I'd rather not put anything in the field.
Hmm, yeah, I was wondering about that and thinking the same - best to leave it as None.

But up above I see

model=kwargs.get("model", "unknown")

so I was trying to match that. But maybe I'm missing a reason it's not done that way in utils.py.
Check the plugin server if we do anything with "unknown". Otherwise, it's just going to match it to the wrong cost model.
Seems like it's only "unknown" in streaming, so I think I should actually just match what was there before - will rejig a little to be more minimal.
Radu-Raicea left a comment
One remark about "unknown" that you should probably fix.
…ehavior

Non-streaming originally returned None when the model wasn't in kwargs. Streaming keeps the "unknown" fallback, as that was the original behavior.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Verifies that non-streaming returns None (not "unknown") when the model is not available in kwargs or the response, matching original behavior.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
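The resolved behavior described in the commits above can be sketched as follows. This is a hypothetical standalone helper, not the actual wrapper code: the requested model from kwargs wins, the response's `model` attribute covers stored prompts, non-streaming falls through to None, and only the streaming path keeps its original "unknown" default.

```python
def extract_model(kwargs: dict, response=None, streaming: bool = False):
    """Resolve the model name for analytics capture (illustrative sketch).

    For OpenAI stored prompts the model is configured in the dashboard,
    so it is absent from kwargs and must be read off the response object.
    """
    model = kwargs.get("model") or getattr(response, "model", None)
    if model is None and streaming:
        # Streaming originally defaulted to "unknown"; preserve that.
        return "unknown"
    # Non-streaming returns None when nothing is known, matching
    # the original behavior (no incorrect cost-model match).
    return model
```

Leaving the field as None rather than "unknown" matters because, as noted in the review, an "unknown" value could be matched against a cost model and produce incorrect cost calculations.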
Problem
When using OpenAI stored prompts, the model is defined in the OpenAI dashboard rather than passed in the API request. The PostHog AI wrapper only extracts the model from request parameters (kwargs.get("model")), causing generations to show up without a model and preventing cost calculations.

Fixes PostHog/posthog#42861
Changes
- utils.py to fall back to response.model when the model is not in kwargs
- openai.py and openai_async.py to extract the model from response chunks
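The streaming change in the last bullet can be sketched like this. The function and chunk shape are assumptions for illustration, not the wrapper's actual code: OpenAI streaming chunks expose a `model` attribute, so when the model is not in kwargs (stored prompts), it can be picked up from the chunks as they are consumed.

```python
def capture_stream_model(chunks, kwargs: dict) -> str:
    """Illustrative sketch: resolve the model name while iterating a stream."""
    # Prefer the explicitly requested model, if any.
    model = kwargs.get("model")
    for chunk in chunks:
        if model is None:
            # Stored prompts: the model only appears on response chunks.
            model = getattr(chunk, "model", None)
    # The streaming path keeps the original "unknown" fallback.
    return model or "unknown"
```

In the real wrapper this resolution would happen alongside token accumulation rather than in a separate pass, but the fallback order is the same: kwargs, then response chunks, then "unknown".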