feat(llm-observability): Add LLM conversation rendering #27626
Conversation
Size Change: +8.2 kB (+0.73%). Total Size: 1.13 MB
📸 UI snapshots have been updated. 4 snapshot changes in total: 0 added, 4 modified, 0 deleted. Triggered by this commit.
Support for dark mode
📸 UI snapshots have been updated. 126 snapshot changes in total: 0 added, 126 modified, 0 deleted. Triggered by this commit.
Force-pushed from b75b7e4 to e71e477
Left a few comments, but nothing blocks it
function MessageDisplay({
    role,
    content,
    additionalKwargs,
    isOutput,
}: {
    role: CompletionMessage['role']
    content: CompletionMessage['content']
    additionalKwargs: CompletionMessage['additional_kwargs']
    isOutput?: boolean
}): JSX.Element {
    const [isRenderingMarkdown, setIsRenderingMarkdown] = useState(!!content)

    const additionalKwargsEntries = additionalKwargs && Object.entries(additionalKwargs)
Tool calls can also be a part of a message.
Yes, let's ditch additional_kwargs if we're consolidating on including tool calls etc. within the message in the SDKs.
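The consolidation being discussed could look something like the sketch below: fold a legacy additional_kwargs payload into the message itself so tool calls live directly on the message. All type and function names here are illustrative assumptions, not the actual SDK types.

```typescript
// Hypothetical shapes; the real SDK types may differ.
type ToolCall = { id: string; function: { name: string; arguments: string } }

type LegacyMessage = {
    role: string
    content: string | null
    additional_kwargs?: { tool_calls?: ToolCall[] }
}

type Message = {
    role: string
    content: string | null
    tool_calls?: ToolCall[]
}

// Move tool calls out of additional_kwargs and onto the message itself.
function consolidateMessage(msg: LegacyMessage): Message {
    const { additional_kwargs, ...rest } = msg
    return additional_kwargs?.tool_calls
        ? { ...rest, tool_calls: additional_kwargs.tool_calls }
        : rest
}
```

With messages in this shape, the UI can render tool calls from a single well-known field instead of iterating over arbitrary additional_kwargs entries.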
<JSONViewer
    key={key}
    name={key}
    src={value}
tool_call.function.arguments is stringified JSON. It's better to parse the string into an object so users can navigate it.
Good point, added parsing.
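A minimal sketch of the parsing the reviewer suggests (the helper name is hypothetical, not from the PR): tool_call.function.arguments arrives as stringified JSON, so parse it before handing it to the JSON viewer, falling back to the raw string on malformed input.

```typescript
// Hypothetical helper: parse stringified tool-call arguments for display.
function parseToolCallArguments(args: string): unknown {
    try {
        return JSON.parse(args)
    } catch {
        // Keep the raw string so malformed arguments still render.
        return args
    }
}
```

The parsed value could then be passed as src to the JSONViewer so users can expand and navigate the argument tree.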
📸 UI snapshots have been updated. 20 snapshot changes in total: 0 added, 20 modified, 0 deleted. Triggered by this commit.
Force-pushed from b31e0aa to 1c1f959
Force-pushed from 1c1f959 to 31cb736
📸 UI snapshots have been updated. 1 snapshot change in total: 0 added, 1 modified, 0 deleted. Triggered by this commit.
📸 UI snapshots have been updated. 2 snapshot changes in total: 0 added, 2 modified, 0 deleted. Triggered by this commit.
Changes
Intuitive LLM conversation UI. Give it a try in the Generations tab.
Added a few stories for various cases; see the UI snapshots.