This repository has been archived by the owner on Jun 17, 2023. It is now read-only.

Concurrency and exponential backoff. 3.5 turbo 16k #1

Open
wants to merge 1 commit into main

Conversation

CarsonStevens

I don't have access to gpt-4, unfortunately, so you may want to change those parameters back. I did switch to the newer 3.5 16k-context model. I know your retry method works, but I've been using the backoff lib and switched to that for simplicity's sake. I also switched to a streaming context for the main conversation. I left your normal chat function for the KB articles and user-profile updating, though; those should update in the background using threads. Because of the streaming, I also added the tiktoken lib to calculate the current token usage of the conversations that stream.
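For reference, the retry behavior the `backoff` library provides is usually just a decorator like `@backoff.on_exception(backoff.expo, openai.error.RateLimitError)` on the chat call. The stdlib sketch below (a hypothetical stand-in, not the PR's actual code) shows roughly what that expands to:

```python
import functools
import random
import time

def retry_with_expo_backoff(max_tries=5, base=1.0, cap=30.0, exceptions=(Exception,)):
    """Rough stand-in for backoff.on_exception(backoff.expo, ...)."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(max_tries):
                try:
                    return fn(*args, **kwargs)
                except exceptions:
                    if attempt == max_tries - 1:
                        raise  # give up after the last allowed try
                    # Exponential delay with full jitter, capped at `cap` seconds.
                    delay = min(cap, base * (2 ** attempt)) * random.random()
                    time.sleep(delay)
        return wrapper
    return decorator

calls = []

@retry_with_expo_backoff(max_tries=4, base=0.01, exceptions=(RuntimeError,))
def flaky_chat_call():
    # Simulates a rate-limited API call that succeeds on the third try.
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("rate limited")
    return "ok"
```

With the real library, `backoff.expo` supplies the doubling delays and `max_tries`/`max_time` bound the retries, so none of this loop has to be written by hand.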
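Because streamed responses arrive as content deltas and no `usage` field is available until the stream ends, token counting has to happen client-side. A minimal sketch of that accumulation, using a whitespace split as a stand-in tokenizer (real code would use `tiktoken.encoding_for_model("gpt-3.5-turbo-16k")` and `len(enc.encode(text))`):

```python
def count_tokens(text):
    # Stand-in for tiktoken; real code would be:
    #   enc = tiktoken.encoding_for_model("gpt-3.5-turbo-16k")
    #   return len(enc.encode(text))
    return len(text.split())

def consume_stream(chunks):
    """Accumulate streamed deltas and report running token usage."""
    parts = []
    for delta in chunks:  # each delta mimics choices[0].delta.content
        parts.append(delta)
    text = "".join(parts)
    return text, count_tokens(text)

full_reply, tokens_used = consume_stream(["Hello", " there", ", how can I help?"])
```

In the application this count would be added to the prompt-side token total so the conversation can be trimmed before it exceeds the 16k context window.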
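The background updates can run on plain `threading.Thread`s so they don't block the streamed reply. A minimal sketch, where `update_kb_articles` and `update_user_profile` are placeholder names rather than the PR's actual functions:

```python
import threading

results = {}

def update_kb_articles(conversation):
    # Placeholder for the non-streaming chat call that refreshes KB articles.
    results["kb"] = f"kb updated from {len(conversation)} messages"

def update_user_profile(conversation):
    # Placeholder for the non-streaming chat call that updates the profile.
    results["profile"] = f"profile updated from {len(conversation)} messages"

conversation = ["user: hi", "assistant: hello"]
threads = [
    threading.Thread(target=update_kb_articles, args=(conversation,), daemon=True),
    threading.Thread(target=update_user_profile, args=(conversation,), daemon=True),
]
for t in threads:
    t.start()
for t in threads:
    t.join()  # in the app these would keep running while the reply streams
```

Daemon threads keep the process from hanging on exit if an update is still in flight, at the cost of possibly dropping an unfinished update.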
