
Updated to work with latest langchain API #57

Merged
2 commits merged on Jun 30, 2024

Conversation

HemuManju
Collaborator

This PR updates the langchain requirement to the latest version (>=0.2.6). The only additional change required is adding the langchain-community package as a dependency.
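The split this PR accounts for can be sketched as follows. This is an illustrative snippet, not code from the PR: in langchain >= 0.2, third-party integrations that used to be importable from the core package moved to the separate langchain-community distribution, so imports like the one below need the extra package installed.

```python
# Hedged sketch of the migration this PR targets (illustrative import):
# pre-0.2 langchain bundled integrations in the core package, e.g.
#   from langchain.llms import OpenAI
# with langchain >= 0.2.6, those live in langchain-community instead.
try:
    from langchain_community.llms import OpenAI  # new home of integrations
    HAVE_COMMUNITY = True
except ImportError:
    HAVE_COMMUNITY = False  # langchain-community not installed

print("langchain-community importable:", HAVE_COMMUNITY)
```

The try/except is only there so the sketch degrades gracefully in environments without langchain-community installed.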

Member

@jeremymanning jeremymanning left a comment


This is good overall, but I also get this warning when I submit a query:
[Screenshot: warning message shown on query submission, 2024-06-29]

I'm using this notebook to test.

I've added a few other minor suggestions too.

@@ -77,7 +76,8 @@ def __init__(self, config):
self.llm_models_factory = ModelsFactory()

self.cache = config["cache_config"]["cache"]
Member


Should we just change this to self.cache = False for now, to be extra safe?

Collaborator Author


Yes, that would work!

@@ -153,7 +153,7 @@ def gpt(self, inputs, prompt):
output : str
The GPT model output in markdown format.
"""
# TODO: Should we create the chain every time? Only prompt is chainging not the model
# TODO: Should we create the chain every time? Only prompt is changing not the model
Member


I've added an issue for this.

---
{text}
---

I'd like the correct answer to be either "[A]", "[B]", "[C]", or "[D]". Can you make up a multiple choice question for me so that I can make sure I really understant the most important concepts? Remember that I can't respond to you, so just ask me to "think about" which choice is correct or something else like that (i.e., without explicitly responding to you). Put two line breaks ("<br>") between each choice so that it appears correctly on my screen. In other words, there should be two line breaks between each of [B], [C], and [D].
I'd like the correct answer to be either "[A]", "[B]", "[C]", or "[D]". Can you make up a multiple choice question for me so that I can make sure I really understand the most important concepts? Remember that I can't respond to you, so just ask me to "think about" which choice is correct or something else like that (i.e., without explicitly responding to you). Put two line breaks ("<br>") between each choice so that it appears correctly on my screen. In other words, there should be two line breaks between each of [B], [C], and [D].
Member


nice catch!

Member


Actually: we'll need to update chatify-server for this change to take effect. But it's still good to make this change.

@@ -11,15 +11,14 @@
history = history_file.read()

requirements = [
"gptcache<=0.1.35",
"langchain<=0.0.226",
"langchain>=0.2.6",
Member


Should this be strictly greater than?

Member


Or... do we even need to pin a version?

Collaborator Author


I agree with you, no need to pin the version.
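The three pinning options discussed above can be sketched side by side. This is an illustrative fragment in the shape of the setup.py requirements list from the diff, not the merged file itself:

```python
# Hedged sketch of the pinning options from the review thread,
# in the shape of the setup.py requirements list shown in the diff.
pinned_old   = ["gptcache<=0.1.35", "langchain<=0.0.226"]  # old upper pin
pinned_floor = ["gptcache<=0.1.35", "langchain>=0.2.6"]    # PR: lower bound
unpinned     = ["gptcache<=0.1.35", "langchain"]           # no pin at all

# A lower bound like "langchain>=0.2.6" guarantees the langchain-community
# split is present; dropping the pin defers the choice to pip's resolver.
print(unpinned[-1])  # langchain
```

A lower bound is the safer middle ground if older langchain releases are known to break the package; no pin keeps installs maximally flexible.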

@jeremymanning jeremymanning merged commit 9ae2668 into ContextLab:main Jun 30, 2024