Display all AI SDK LLM messages in pretty mode #3727
Conversation
Pull Request Overview
This PR adds support for displaying arrays of LLM messages (from AI SDKs like OpenAI, LangGraph, and OpenAI Agents) in a formatted "pretty mode" with collapsible sections. Previously, these multi-message conversations were displayed as plain text or simple concatenations.
Key changes:
- Added `convertLLMMessagesToMarkdown()` function to convert LLM message arrays into formatted markdown with collapsible HTML details elements
- Enhanced detection logic to identify LLM message arrays and route them to the new formatter
- Tool call messages are now collapsed by default while other messages remain expanded (see the sketch below)
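A minimal sketch of the collapsible-sections idea, assuming a simplified message shape and hypothetical names (`SketchMessage`, `messagesToCollapsibleMarkdown`); the PR's actual `convertLLMMessagesToMarkdown()` may differ:

```ts
// Minimal sketch only: tool-call messages render as collapsed <details>,
// all other messages get the `open` attribute so they start expanded.
interface SketchMessage {
  role?: string;
  content: string;
  tool_calls?: unknown[];
}

function messagesToCollapsibleMarkdown(messages: SketchMessage[]): string {
  return messages
    .map((message) => {
      const isToolCall =
        Boolean(message.tool_calls?.length) || message.role === "tool";
      const openAttr = isToolCall ? "" : " open";
      const title = message.role ?? "message";

      return [
        `<details${openAttr}>`,
        `<summary>${title}</summary>`,
        "",
        message.content,
        "",
        "</details>",
      ].join("\n");
    })
    .join("\n\n");
}
```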
Reviewed Changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| apps/opik-frontend/src/lib/conversationMarkdown.ts | Adds new `convertLLMMessagesToMarkdown()` function and `LLMMessage` interface for formatting message arrays |
| apps/opik-frontend/src/lib/prettifyMessage.ts | Adds detection functions (`isLLMMessagesObject`, `extractLLMMessagesArray`) and integrates new markdown conversion into `prettifyMessage()` |
| apps/opik-frontend/src/lib/traces.test.ts | Updates tests to verify collapsible HTML output instead of plain text concatenation |
Hi @dsblank @vincentkoc @claude @Lothiraldan, please review this PR and suggest changes.
Hey @sonianuj287 |
export interface LLMMessage {
  role?: string;
  type?: string;
  content: string | unknown[];
  tool_calls?: ToolCall[];
  tool_call_id?: string;
}
Maybe you could rename this interface because it conflicts with the one in opik/apps/opik-frontend/src/types/llm.ts
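One low-effort way to resolve the clash is a rename; the name below (`PrettyLLMMessage`) is only a suggestion, and `ToolCall` is a stand-in for whatever tool-call type the PR's file already defines.

```ts
// Hypothetical rename to avoid shadowing the shared LLMMessage type in
// apps/opik-frontend/src/types/llm.ts; field shapes copied from the PR.
// ToolCall below is a placeholder, not the project's real definition.
type ToolCall = {
  id?: string;
  function?: { name?: string; arguments?: string };
};

export interface PrettyLLMMessage {
  role?: string;
  type?: string;
  content: string | unknown[];
  tool_calls?: ToolCall[];
  tool_call_id?: string;
}
```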

Details
Change checklist
Issues
#3416
New Function: `convertLLMMessagesToMarkdown`
- Added to `conversationMarkdown.ts`
- Converts arrays of LLM messages to markdown with collapsible sections
- Supports multiple message formats (OpenAI, LangGraph, OpenAI Agents)

Enhanced `prettifyMessage` Function
- Added logic to detect arrays of LLM messages (multiple messages)
- Uses the new markdown conversion for multi-message arrays
- Maintains backward compatibility for single messages

Helper Functions
- `isLLMMessagesObject`: detects objects containing LLM message arrays
- `extractLLMMessagesArray`: extracts messages from various object formats
- Refined `isConversationObject` to avoid conflicts (a rough sketch of the detection and routing follows below)
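A rough sketch of how the detection and routing could fit together; the helper names mirror the PR's, but the bodies here are assumptions rather than the actual implementation.

```ts
// Sketch: decide whether a value looks like an array of LLM messages, and
// if so (and there is more than one message) route it to the collapsible
// markdown converter; otherwise fall back to the existing single-message path.
function looksLikeLLMMessage(value: unknown): boolean {
  return (
    typeof value === "object" &&
    value !== null &&
    "content" in value &&
    ("role" in value || "type" in value)
  );
}

function extractLLMMessagesSketch(input: unknown): unknown[] | null {
  // Bare array of messages.
  if (Array.isArray(input) && input.every(looksLikeLLMMessage)) {
    return input;
  }
  // Wrapper object such as { messages: [...] }.
  if (typeof input === "object" && input !== null) {
    const candidate = (input as { messages?: unknown }).messages;
    if (Array.isArray(candidate) && candidate.every(looksLikeLLMMessage)) {
      return candidate;
    }
  }
  return null;
}

function prettifySketch(input: unknown): string {
  const messages = extractLLMMessagesSketch(input);
  if (messages && messages.length > 1) {
    // Multi-message array: this is where convertLLMMessagesToMarkdown would run.
    return `multi-message input (${messages.length} messages)`;
  }
  // Single message or non-message input: keep the existing prettify behavior.
  return "single-message fallback";
}
```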
Testing
Documentation