
Conversation

@sonianuj287

Details

Change checklist

  • User facing
  • Documentation update

Issues

  • Resolves #3416
  • OPIK-

New Function: convertLLMMessagesToMarkdown

  • Added to conversationMarkdown.ts
  • Converts arrays of LLM messages to markdown with collapsible sections
  • Supports multiple message formats (OpenAI, LangGraph, OpenAI Agents)
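A minimal sketch of what this conversion could look like, assuming the LLMMessage/ToolCall shapes shown later in this thread; the actual implementation in conversationMarkdown.ts may differ in naming and edge-case handling:

```typescript
interface ToolCall {
  id?: string;
  function?: { name?: string; arguments?: string };
}

interface LLMMessage {
  role?: string;
  type?: string;
  content: string | unknown[];
  tool_calls?: ToolCall[];
  tool_call_id?: string;
}

export function convertLLMMessagesToMarkdown(messages: LLMMessage[]): string {
  return messages
    .map((message) => {
      // Fall back to `type` for providers that do not set `role`.
      const label = message.role ?? message.type ?? "message";

      // Tool calls and tool results are collapsed by default; other messages stay open.
      const isToolMessage =
        label === "tool" ||
        Boolean(message.tool_calls?.length) ||
        Boolean(message.tool_call_id);
      const openAttribute = isToolMessage ? "" : " open";

      const body =
        typeof message.content === "string"
          ? message.content
          : JSON.stringify(message.content, null, 2);

      return `<details${openAttribute}>\n<summary>${label}</summary>\n\n${body}\n\n</details>`;
    })
    .join("\n\n");
}
```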

Enhanced prettifyMessage Function

  • Added logic to detect arrays of LLM messages (multiple messages)
  • Uses the new markdown conversion for multi-message arrays (see the combined sketch after the Helper Functions list below)
  • Maintains backward compatibility for single messages

Helper Functions

  • isLLMMessagesObject: Detects objects containing LLM message arrays
  • extractLLMMessagesArray: Extracts messages from various object formats
  • Refined isConversationObject to avoid conflicts with the new LLM message detection
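Roughly, the helpers and the routing inside prettifyMessage could look like the following. The wrapper keys (`messages`, `input`), the predicate, and the import path are assumptions based on the formats listed above, not the PR's exact code:

```typescript
import { convertLLMMessagesToMarkdown, type LLMMessage } from "./conversationMarkdown";

function looksLikeLLMMessage(value: unknown): value is LLMMessage {
  if (typeof value !== "object" || value === null) return false;
  const candidate = value as Record<string, unknown>;
  return "content" in candidate && ("role" in candidate || "type" in candidate);
}

// Extracts the message array from common wrapper shapes, e.g.
// { messages: [...] } (OpenAI, LangGraph) or { input: [...] } (OpenAI Agents).
export function extractLLMMessagesArray(value: unknown): LLMMessage[] | null {
  if (typeof value !== "object" || value === null) return null;
  const candidate = value as Record<string, unknown>;

  for (const key of ["messages", "input"]) {
    const maybeMessages = candidate[key];
    if (
      Array.isArray(maybeMessages) &&
      maybeMessages.length > 1 &&
      maybeMessages.every(looksLikeLLMMessage)
    ) {
      return maybeMessages;
    }
  }
  return null;
}

// Detects objects that contain an LLM message array.
export function isLLMMessagesObject(value: unknown): boolean {
  return extractLLMMessagesArray(value) !== null;
}

// Inside prettifyMessage, multi-message arrays are routed to the markdown
// converter; single messages keep the existing pretty-printing path.
export function prettifyLLMMessages(data: unknown): string | null {
  const messages = extractLLMMessagesArray(data);
  return messages ? convertLLMMessagesToMarkdown(messages) : null;
}
```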

Testing

Documentation

@sonianuj287 sonianuj287 requested a review from a team as a code owner October 20, 2025 09:43
Copilot AI review requested due to automatic review settings October 20, 2025 09:43

Copilot AI left a comment


Pull Request Overview

This PR adds support for displaying arrays of LLM messages (from AI SDKs like OpenAI, LangGraph, and OpenAI Agents) in a formatted "pretty mode" with collapsible sections. Previously, these multi-message conversations were displayed as plain text or simple concatenations.

Key changes:

  • Added convertLLMMessagesToMarkdown() function to convert LLM message arrays into formatted markdown with collapsible HTML details elements
  • Enhanced detection logic to identify LLM message arrays and route them to the new formatter
  • Tool call messages are now collapsed by default while other messages remain expanded
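For illustration, a conversation like the one below would render as a sequence of <details> blocks in which only the tool-call and tool-result entries are collapsed. The message shapes and import path here are made up for the example:

```typescript
import { convertLLMMessagesToMarkdown } from "./conversationMarkdown";

const markdown = convertLLMMessagesToMarkdown([
  { role: "user", content: "What is the weather in Paris?" },
  {
    role: "assistant",
    content: "",
    tool_calls: [
      { id: "call_1", function: { name: "get_weather", arguments: '{"city":"Paris"}' } },
    ],
  },
  { role: "tool", tool_call_id: "call_1", content: '{"temperature":"18C"}' },
  { role: "assistant", content: "It is 18°C in Paris." },
]);

// The user and final assistant messages come back as `<details open>` blocks,
// while the tool call and tool result come back without the `open` attribute.
console.log(markdown);
```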

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.

  • apps/opik-frontend/src/lib/conversationMarkdown.ts: Adds new convertLLMMessagesToMarkdown() function and LLMMessage interface for formatting message arrays
  • apps/opik-frontend/src/lib/prettifyMessage.ts: Adds detection functions (isLLMMessagesObject, extractLLMMessagesArray) and integrates the new markdown conversion into prettifyMessage()
  • apps/opik-frontend/src/lib/traces.test.ts: Updates tests to verify collapsible HTML output instead of plain-text concatenation
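A test along these lines is what the last item refers to, assuming a vitest-style setup; the actual fixtures and assertions in traces.test.ts will differ:

```typescript
import { describe, expect, it } from "vitest";
import { convertLLMMessagesToMarkdown } from "./conversationMarkdown";

describe("convertLLMMessagesToMarkdown", () => {
  it("wraps each message in a collapsible <details> block", () => {
    const markdown = convertLLMMessagesToMarkdown([
      { role: "user", content: "Hello" },
      { role: "assistant", content: "Hi there" },
    ]);

    expect(markdown).toContain("<summary>user</summary>");
    expect(markdown).toContain("<summary>assistant</summary>");
    expect(markdown).not.toContain("Hello\nHi there"); // no plain concatenation
  });
});
```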

@sonianuj287 sonianuj287 force-pushed the fix-Display-all-AI-SDK-LLM-message branch from 3350664 to d700c19 on October 20, 2025 18:36
@dsblank dsblank added the test-environment (Deploy Opik adhoc environment) label on Oct 21, 2025
@dsblank dsblank force-pushed the fix-Display-all-AI-SDK-LLM-message branch from 2161710 to 850f9fe on October 21, 2025 15:13
@sonianuj287

Hi @dsblank @vincentkoc @claude @Lothiraldan, please review this PR and suggest any changes.
Thanks :)


aadereiko commented Oct 31, 2025

Hey @sonianuj287
Could you please attach videos showing what exactly has been changed?


@aadereiko aadereiko left a comment


Good work, the code base looks good to go. One comment about UI/UX.

  1. I think we should write "Ai" with capitalised letters, i.e. "AI".

(screenshot attached)

Comment on lines +225 to +231
export interface LLMMessage {
role?: string;
type?: string;
content: string | unknown[];
tool_calls?: ToolCall[];
tool_call_id?: string;
}

Maybe you could rename this interface because it conflicts with the one in opik/apps/opik-frontend/src/types/llm.ts
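For instance, a rename along these lines would avoid the collision; the new name is only an illustration, and the ToolCall shape is repeated here just to keep the snippet self-contained:

```typescript
interface ToolCall {
  id?: string;
  function?: { name?: string; arguments?: string };
}

// Renamed from LLMMessage so it no longer shadows the type exported
// from src/types/llm.ts.
export interface ConversationMarkdownMessage {
  role?: string;
  type?: string;
  content: string | unknown[];
  tool_calls?: ToolCall[];
  tool_call_id?: string;
}
```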

