
feat: display chat events #52

Merged
marcusschiesser merged 14 commits into main from feat/display-chat-events on Apr 24, 2024

Conversation


@thucpn thucpn commented Apr 17, 2024

Summary by CodeRabbit

  • New Features

    • Introduced enhanced content generation for chat functionalities, handling both text and events.
    • Added new callback event handling classes for improved streaming data management.
    • Implemented collapsible chat events components in the UI, enhancing user interaction.
  • Enhancements

    • Updated streaming data functions and response handling for better performance and flexibility.
  • Dependencies

    • Added aiostream and @radix-ui/react-collapsible to manage asynchronous streams and UI collapsibility, respectively.

@thucpn thucpn requested a review from marcusschiesser April 17, 2024 08:23

changeset-bot bot commented Apr 17, 2024

🦋 Changeset detected

Latest commit: eb9a36c

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:
  • create-llama: Patch



coderabbitai bot commented Apr 17, 2024

Walkthrough

The recent updates enhance streaming functionalities across multiple platforms, integrating event handling and content generation. FastAPI and Express backends now support more dynamic streaming responses, while Next.js frontends improve user interaction with collapsible UI components for chat events. Additionally, dependencies have been updated to support these new features.

Changes

Files and changes:
  • .../fastapi/app/api/routers/chat.py, .../fastapi/app/api/routers/messaging.py: Enhanced streaming and event handling in FastAPI with new classes and refactored generators.
  • .../fastapi/pyproject.toml: Added aiostream dependency.
  • .../express/src/controllers/..., .../express/src/controllers/stream-helper.ts: Updated Express controllers for streaming, moved specific functions to a helper file.
  • .../nextjs/app/api/chat/route.ts, .../nextjs/app/components/ui/chat/..., .../nextjs/app/components/ui/collapsible.tsx, .../nextjs/app/components/ui/index.ts: Next.js updates for streaming data handling, new UI components for collapsible chat events.
  • .../nextjs/package.json: Added @radix-ui/react-collapsible.

🐰✨
A hop and a skip in the code, we weave,
New streams flow, as we believe.
Events unfold, in collapsible style,
A chat that dances, mile after mile.
Cheers to changes, big and small,
In our digital burrow, we manage it all! 🎉
🐰✨


Recent Review Details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits: files that changed from the base of the PR and between 046ff06 and eb9a36c.
Files selected for processing (16)
  • .changeset/spicy-moose-allow.md (1 hunks)
  • templates/types/streaming/express/src/controllers/chat.controller.ts (3 hunks)
  • templates/types/streaming/express/src/controllers/llamaindex-stream.ts (2 hunks)
  • templates/types/streaming/express/src/controllers/stream-helper.ts (1 hunks)
  • templates/types/streaming/fastapi/app/api/routers/chat.py (4 hunks)
  • templates/types/streaming/fastapi/app/api/routers/messaging.py (1 hunks)
  • templates/types/streaming/fastapi/pyproject.toml (1 hunks)
  • templates/types/streaming/nextjs/app/api/chat/llamaindex-stream.ts (2 hunks)
  • templates/types/streaming/nextjs/app/api/chat/route.ts (3 hunks)
  • templates/types/streaming/nextjs/app/api/chat/stream-helper.ts (1 hunks)
  • templates/types/streaming/nextjs/app/components/ui/chat/chat-events.tsx (1 hunks)
  • templates/types/streaming/nextjs/app/components/ui/chat/chat-message.tsx (3 hunks)
  • templates/types/streaming/nextjs/app/components/ui/chat/chat-messages.tsx (1 hunks)
  • templates/types/streaming/nextjs/app/components/ui/chat/index.ts (2 hunks)
  • templates/types/streaming/nextjs/app/components/ui/collapsible.tsx (1 hunks)
  • templates/types/streaming/nextjs/package.json (1 hunks)
Files skipped from review as they are similar to previous changes (14)
  • .changeset/spicy-moose-allow.md
  • templates/types/streaming/express/src/controllers/llamaindex-stream.ts
  • templates/types/streaming/express/src/controllers/stream-helper.ts
  • templates/types/streaming/fastapi/app/api/routers/messaging.py
  • templates/types/streaming/fastapi/pyproject.toml
  • templates/types/streaming/nextjs/app/api/chat/llamaindex-stream.ts
  • templates/types/streaming/nextjs/app/api/chat/route.ts
  • templates/types/streaming/nextjs/app/api/chat/stream-helper.ts
  • templates/types/streaming/nextjs/app/components/ui/chat/chat-events.tsx
  • templates/types/streaming/nextjs/app/components/ui/chat/chat-message.tsx
  • templates/types/streaming/nextjs/app/components/ui/chat/chat-messages.tsx
  • templates/types/streaming/nextjs/app/components/ui/chat/index.ts
  • templates/types/streaming/nextjs/app/components/ui/collapsible.tsx
  • templates/types/streaming/nextjs/package.json
Additional Context Used
Ruff (1)
templates/types/streaming/fastapi/app/api/routers/chat.py (1)

6-6: llama_index.core.chat_engine.types.StreamingAgentChatResponse imported but unused

Additional comments not posted (9)
templates/types/streaming/express/src/controllers/chat.controller.ts (6)

1-1: Ensure that all imported modules are used within the file to avoid unnecessary imports.


47-47: Initialization of vercelStreamData is correct and follows the new structure for handling streaming data.


54-60: The callback setup for Settings.callbackManager is correctly implemented to handle streaming data events.


65-65: Ensure that the chatHistory parameter is correctly populated and used within the chat function.


70-70: The integration of LlamaIndexStream with vercelStreamData is correctly implemented to handle streaming responses.


77-77: The piping of the stream through vercelStreamData.stream is correctly set up to handle the response.

templates/types/streaming/fastapi/app/api/routers/chat.py (3)

97-98: The addition of event_handler to chat_engine.callback_manager.handlers is correctly implemented to handle event callbacks.
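
For readers unfamiliar with the pattern, such a handler might look roughly like the sketch below. This is an illustrative assumption, not the template's actual code: the class name, the queue-based design, and the async_event_gen helper are invented for the example; only BaseCallbackHandler and CBEventType are existing llama_index APIs.

```python
import asyncio
from typing import Any, Dict, List, Optional

from llama_index.core.callbacks.base_handler import BaseCallbackHandler
from llama_index.core.callbacks.schema import CBEventType


class EventCallbackHandler(BaseCallbackHandler):
    """Collects callback events on a queue so they can be streamed to the client."""

    def __init__(self) -> None:
        super().__init__(event_starts_to_ignore=[], event_ends_to_ignore=[])
        self._queue: asyncio.Queue = asyncio.Queue()
        self.is_done = False

    def on_event_start(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        parent_id: str = "",
        **kwargs: Any,
    ) -> str:
        self._queue.put_nowait({"event": event_type.value, "status": "started"})
        return event_id

    def on_event_end(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        **kwargs: Any,
    ) -> None:
        self._queue.put_nowait({"event": event_type.value, "status": "finished"})

    def start_trace(self, trace_id: Optional[str] = None) -> None:
        pass

    def end_trace(
        self,
        trace_id: Optional[str] = None,
        trace_map: Optional[Dict[str, List[str]]] = None,
    ) -> None:
        pass

    async def async_event_gen(self):
        # Yield queued events until the owning request marks the handler as done.
        while not self.is_done or not self._queue.empty():
            try:
                yield await asyncio.wait_for(self._queue.get(), timeout=0.1)
            except asyncio.TimeoutError:
                continue
```

Appending an instance to chat_engine.callback_manager.handlers, as the reviewed lines do, lets retrieval and LLM steps surface as events the frontend can display.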


101-124: The content_generator function is well-refactored to handle both text and event streams. Ensure that the stream management correctly handles client disconnections.
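
Since this comment touches both stream merging and disconnect handling, here is a minimal sketch of the general shape, assuming the aiostream dependency added in this PR. It is not the template's exact implementation: the parameter names and the "0:"/"2:" line prefixes (borrowed from the Vercel AI data-stream convention) are assumptions for illustration.

```python
import json

from aiostream import stream
from fastapi import Request


async def content_generator(request: Request, text_gen, event_gen):
    """Merge LLM text chunks and callback events into one streamed response."""

    async def _texts():
        async for token in text_gen:
            yield "0:" + json.dumps(token) + "\n"  # text part

    async def _events():
        async for event in event_gen:
            yield "2:" + json.dumps([event]) + "\n"  # data part for the chat-events UI

    combined = stream.merge(_texts(), _events())
    async with combined.stream() as streamer:
        async for chunk in streamer:
            if await request.is_disconnected():
                # Stop producing output once the client has gone away.
                break
            yield chunk
```

The early break on request.is_disconnected() is one way to address the reviewer's concern about client disconnections; the resulting generator can then be handed to a FastAPI StreamingResponse, which is what the comment on line 139 below refers to.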


139-139: The return statement in the chat function correctly wraps the generator function for streaming responses.




@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

.changeset/spicy-moose-allow.md (comment outdated and resolved)

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

@thucpn thucpn requested a review from marcusschiesser April 22, 2024 03:24
@thucpn thucpn force-pushed the feat/display-chat-events branch from 939d113 to c194d7b Compare April 22, 2024 04:40

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

Actionable comments outside the diff hunks (1)
templates/types/streaming/fastapi/app/api/routers/chat.py (1)

Line range hint 6-6: Remove the unused import StreamingAgentChatResponse to clean up the code.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

@marcusschiesser marcusschiesser force-pushed the feat/display-chat-events branch from 1d74be7 to 16a61cc Compare April 24, 2024 03:14
@leehuwuj leehuwuj force-pushed the feat/display-chat-events branch from 16a61cc to 41d6a66 Compare April 24, 2024 03:33

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Out of diff range and nitpick comments (1)
templates/types/streaming/fastapi/app/api/routers/chat.py (1)

Line range hint 6-6: Remove unused import StreamingAgentChatResponse to clean up the code.

- from llama_index.core.chat_engine.types import StreamingAgentChatResponse

@marcusschiesser marcusschiesser force-pushed the feat/display-chat-events branch from 41d6a66 to eb9a36c Compare April 24, 2024 07:19
@marcusschiesser marcusschiesser merged commit cfb5257 into main Apr 24, 2024
5 checks passed
@marcusschiesser marcusschiesser deleted the feat/display-chat-events branch April 24, 2024 07:21

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Out of diff range and nitpick comments (1)
templates/types/streaming/fastapi/app/api/routers/chat.py (1)

Line range hint 6-6: Remove unused import StreamingAgentChatResponse to clean up the code.

- from llama_index.core.chat_engine.types import StreamingAgentChatResponse

@@ -25,7 +26,7 @@ const convertMessageContent = (
 export const chat = async (req: Request, res: Response) => {
   try {
-    const { messages, data }: { messages: ChatMessage[]; data: any } = req.body;
+    const { messages, data }: { messages: Message[]; data: any } = req.body;


The validation logic for messages and userMessage should be adjusted to check the array before popping an element to prevent runtime errors.

- const userMessage = messages.pop();
- if (!messages || !userMessage || userMessage.role !== "user") {
+ if (!messages || messages.length === 0) return res.status(400).json({ error: "No messages provided in the request." });
+ const userMessage = messages.pop();
+ if (!userMessage || userMessage.role !== "user") {

Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.

Suggested change:

    const { messages, data }: { messages: Message[]; data: any } = req.body;
    if (!messages || messages.length === 0) return res.status(400).json({ error: "No messages provided in the request." });
    const userMessage = messages.pop();
    if (!userMessage || userMessage.role !== "user") {
