feat: Set the initial value of the chat modal to the recently created token #2864
Conversation
Coverage report for cf3c4b3
| St.❔ | Category | Percentage | Covered / Total |
| --- | --- | --- | --- |
| 🔴 | Statements | 5.37% (+0.08% 🔼) | 395/7352 |
| 🔴 | Branches | 4.67% (+0.08% 🔼) | 237/5080 |
| 🔴 | Functions | 3.22% (+0.06% 🔼) | 78/2422 |
| 🔴 | Lines | 5.28% (+0.08% 🔼) | 380/7192 |
Test suite run success
124 tests passing in 14 suites.
Report generated by 🧪 jest coverage report action from cf3c4b3
The branch was force-pushed from bd8f2bc to cf3c4b3.
LGTM
Merge activity
feat: Set the initial value of the chat modal to the recently created token (#2864)

resolves #2865

**Changes:**

Adds automatic token population in the chat UI modal by passing the most recently created endpoint token to the chat interface.

**Details:**

- Extends `ChatUIModal` to accept endpoint token fragment data
- Adds logic to find the newest token from the endpoint token list
- Automatically populates the token field in the chat interface with the most recent token
- Removes the need for manual token entry, improving the user experience

**How to test:**

1. Click the LLM Chat Test button on the serving page.
2. If the endpoint is not a public vLLM model, the custom chat modal opens.
3. If tokens exist, the most recently created valid token is set as the placeholder.
4. If there is no valid token, the placeholder is empty.

**Checklist:**

- [x] Mention the original issue
- [ ] Documentation
- [ ] Minimum required manager version
- [x] Specific setting for review (e.g., KB link, endpoint, or how to set up)
- [ ] Minimum requirements to check during review
- [ ] Test case(s) to demonstrate the difference before/after

**Review Requirements:**

1. Verify that the newest token is correctly identified from the endpoint token list
2. Confirm the token field is pre-populated in the chat interface
3. Test that the chat functionality works with the auto-populated token
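For reference, here is a minimal TypeScript sketch of the "find the newest valid token" logic described above. The fragment shape and field names (`token`, `created_at`, `valid_until`) are assumptions for illustration, not the actual GraphQL fragment consumed by `ChatUIModal`:

```typescript
// Assumed shape of an endpoint token fragment; real field names may differ.
interface EndpointTokenFragment {
  token: string;
  created_at: string; // ISO 8601 creation timestamp (assumed field)
  valid_until?: string; // ISO 8601 expiry; absent means no expiry (assumed field)
}

// Pick the most recently created token that has not yet expired.
// Returns undefined when no valid token exists, so the caller can
// leave the chat modal's token placeholder empty.
function findNewestValidToken(
  tokens: readonly EndpointTokenFragment[],
): string | undefined {
  const now = Date.now();
  return tokens
    .filter((t) => !t.valid_until || Date.parse(t.valid_until) > now)
    .sort((a, b) => Date.parse(b.created_at) - Date.parse(a.created_at))[0]
    ?.token;
}
```

The modal can then use the result as the initial value of its token field, falling back to an empty placeholder when `findNewestValidToken` returns `undefined`.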
The branch was force-pushed from cf3c4b3 to cec0a33.