
Converse API Support #11

Open · wants to merge 2 commits into main
Conversation

@brianandres2 (Collaborator) commented Feb 11, 2025

Added support for the Converse API, for both text and tools.

Run it with

llm = BedrockConverseLLM(modelId)
# prompt_message is an LLMMessage; inference_config is an LLMInferenceConfig
llm.prompt([prompt_message], inference_config)
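For context, a minimal sketch of the kind of request this call would assemble for boto3's `client.converse(...)`. The `LLMMessage`/`LLMInferenceConfig` dataclasses below are illustrative stand-ins for the PR's classes, and the exact field mapping is an assumption, not the PR's actual code:

```python
# Sketch of a Converse request body, assuming simple text-only messages.
from dataclasses import dataclass

@dataclass
class LLMMessage:
    role: str   # "user" or "assistant"
    text: str

@dataclass
class LLMInferenceConfig:
    max_tokens: int = 512
    temperature: float = 0.7

def to_converse_kwargs(model_id, messages, cfg):
    # Converse expects camelCase keys, and content as a list of blocks.
    return {
        "modelId": model_id,
        "messages": [
            {"role": m.role, "content": [{"text": m.text}]} for m in messages
        ],
        "inferenceConfig": {
            "maxTokens": cfg.max_tokens,
            "temperature": cfg.temperature,
        },
    }

kwargs = to_converse_kwargs(
    "anthropic.claude-3-haiku-20240307-v1:0",
    [LLMMessage("user", "Hello")],
    LLMInferenceConfig(),
)
# An actual call would then be:
# boto3.client("bedrock-runtime").converse(**kwargs)
```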

Ran into some issues because the formatting is a lot different from Claude's. Things of note:

  1. The toolChoice field isn't implemented yet. From research, it looks like it's only used by the Claude and Mistral models, and mostly for debugging purposes. It appears that any model that can use tools will automatically use one if it is attached and the model decides it needs to. Some documentation on that here and here
  2. Ran into a small formatting problem with 'from_llm_message' when translating from an LLMMessage. I have a temporary solution that only works for text LLM messages (as opposed to toolResult, images, etc.). I think there was talk about changing the LLMMessage class, so I didn't look too deeply into making a workaround.
  3. I added more classes to make formatting the request easier, but let me know if it's too much. You can mostly see the effect of that in the 'to_request_body' method.
  4. Some things had messy-ish formatting:
    a. Since we use .converse() instead of .invoke_model(), I couldn't do 'response["body"].read().decode("UTF-8")', which makes it a little harder to format the response into a ConverseResponse.
    b. ConverseToolSpec had some odd formatting: .get_function_schema(func) only has 'openAI' and 'Claude' options, so I took the OpenAI output and edited it to work for Converse, which feels a little messy. Similarly, in the .use() part the formatting wasn't as clean, and I manually added {"json": {"result": str(func_out)}}, which also felt messy.
  5. Converse's fields are in camelCase, which our linting does not like. I added ignore comments for each field, which ended up being a lot of them.
  6. ConverseResponse doesn't include 'ResponseMetadata' and 'metrics' yet.
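To make points 1 and 4b concrete, here is a hedged sketch of what an explicit toolChoice and the manual toolResult wrapping look like in Converse's wire format. The tool name, schema, and helper function are made up for illustration:

```python
# Illustrative Converse toolConfig with an explicit toolChoice (the field this
# PR does not implement yet; only some models, e.g. Claude and Mistral,
# support forcing a tool).
tool_config = {
    "tools": [
        {
            "toolSpec": {
                "name": "get_weather",  # hypothetical tool
                "description": "Look up the weather for a city.",
                "inputSchema": {
                    # Converse nests the JSON Schema under a "json" key.
                    "json": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    }
                },
            }
        }
    ],
    # {"auto": {}} lets the model decide; {"tool": {"name": "get_weather"}}
    # would force that specific tool.
    "toolChoice": {"auto": {}},
}

# Returning a tool result requires the {"json": ...} wrapper noted in 4b.
def tool_result_block(tool_use_id, func_out):
    return {
        "toolResult": {
            "toolUseId": tool_use_id,
            "content": [{"json": {"result": str(func_out)}}],
        }
    }

block = tool_result_block("tooluse_abc123", 72)
```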

Other things of note:

  1. Some models have unique-to-themselves parameters that can be added through the 'additionalModelRequestFields' field. Right now I just have Claude's "top_k" added, but we can add more as wanted. Some documentation on that
  2. To access some models (i.e. Nova, Llama) you have to go to the cross-region inference tab in AWS Bedrock and get the model IDs there. I copied all of them into the "CONVERSE_BEDROCK_MODEL_IDS" dictionary. It was just adding 'us.' in front of the IDs.
  3. It looks like it would be fairly easy to add image support.
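Point 2 above amounts to a simple prefixing rule. A small sketch (the base model IDs listed are examples only; the PR's CONVERSE_BEDROCK_MODEL_IDS dictionary covers the full set):

```python
# Cross-region inference profile IDs are the base Bedrock model IDs with a
# region-group prefix (here "us."). Example entries for illustration.
BASE_MODEL_IDS = [
    "amazon.nova-lite-v1:0",
    "meta.llama3-2-3b-instruct-v1:0",
]

CONVERSE_BEDROCK_MODEL_IDS = {base: f"us.{base}" for base in BASE_MODEL_IDS}
```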
