
Implemented a call interface to LLMModel #24

Merged — maykcaldas merged 4 commits into main from test_img on Dec 16, 2024
Conversation

maykcaldas
Collaborator

This PR implements a call method on LLMModel that takes a list of Messages as input.
As a result, LLMModel has both run_prompt, which processes prompts directly (to keep paper-qa compatibility), and a new method, call, which accepts a list of Messages that can include images.
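A minimal usage sketch of the new interface, for illustration only: the import paths, the Message/LLMModel constructor arguments, and the assumption that call is awaitable are guesses, not taken from this PR's diff.

```python
# Hypothetical usage sketch; names and signatures below are assumptions.
import asyncio

from llmclient import LLMModel  # assumed import path
from llmclient.types import Message  # assumed import path


async def main() -> None:
    model = LLMModel(name="gpt-4o")  # assumed constructor argument
    messages = [
        Message(role="system", content="You are a helpful assistant."),
        # Per this PR, messages passed to `call` may also carry images;
        # the exact field for attaching an image is not shown here.
        Message(role="user", content="Describe this image."),
    ]
    result = await model.call(messages)  # new interface added in this PR
    print(result)  # assumed: the returned LLMResult prints the completion


if __name__ == "__main__":
    asyncio.run(main())
```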

@maykcaldas maykcaldas marked this pull request as ready for review December 16, 2024 19:49
Review comment on llmclient/llms.py (resolved)
@ludomitch

LGTM - just one minor docstring comment

@maykcaldas maykcaldas merged commit db2a226 into main Dec 16, 2024
6 checks passed
@maykcaldas maykcaldas deleted the test_img branch December 16, 2024 23:19
@jamesbraza
Contributor

I chatted with Mayk in person, and I understand the current goal is to consolidate everything into this repo with minimal rewrites ("expand phase").

Now that this is mostly complete, LLMModel has many redundant methods:

  • achat/achat_iter and acomplete/acomplete_iter returning Chunk variants
  • call returning LLMResult
  • run_prompt returning LLMResult

When we do the "contract phase", we should standardize on one way. Here are my thoughts (a rough sketch follows this list):

  • Standardize on "chat", because its role and message-history concepts directly align with Message
  • But don't name it chat; name it call instead, since llm-client is opinionated towards simplicity
  • Remove Chunk in favor of LLMResult
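For concreteness, a hypothetical sketch of what that consolidated interface could look like after the "contract phase". The field names and signatures are illustrative assumptions, not code from this PR or the current llm-client API.

```python
# Hypothetical "contract phase" interface: one chat-style `call` returning
# LLMResult, with Chunk removed. All names and fields below are assumptions.
from dataclasses import dataclass


@dataclass
class Message:
    role: str  # "system" | "user" | "assistant"
    content: str


@dataclass
class LLMResult:
    text: str
    model: str
    # Token counts, cost, etc. would live here rather than on a separate Chunk.
    prompt_tokens: int = 0
    completion_tokens: int = 0


class LLMModel:
    def __init__(self, name: str = "gpt-4o") -> None:  # placeholder default
        self.name = name

    async def call(self, messages: list[Message]) -> LLMResult:
        """Single entry point: chat semantics, simple name, one result type."""
        raise NotImplementedError  # backend-specific chat completion goes here
```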
