
Bedrock client #213

Merged
merged 5 commits into from
Oct 6, 2024

Conversation

@khalilfiremind (Contributor) commented Sep 15, 2024

The AWS Bedrock client has been added.

The user can access LLMs via a single API:

  • AWS Titan
  • Claude
  • Cohere
  • Llama
  • Mistral
  • Jamba

The user can access AWS Bedrock via a profile (if SSO is set up) or via a secret key and access key.

Alternatively, if the user assumes a role that has access to Bedrock (e.g. when deploying on AWS Lambda), there is no need to pass anything; the client will pick up the credentials automatically.
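The credential fallbacks above can be sketched as a simple resolution order. The function and parameter names here are hypothetical, mirroring boto3's usual credential chain; they are not part of this PR:

```python
from typing import Optional

def resolve_credential_mode(
    aws_profile_name: Optional[str] = None,
    aws_access_key_id: Optional[str] = None,
    aws_secret_access_key: Optional[str] = None,
) -> str:
    """Return which boto3 credential mode would be used (sketch only)."""
    if aws_profile_name:
        return "profile"        # SSO / named profile from ~/.aws/config
    if aws_access_key_id and aws_secret_access_key:
        return "static_keys"    # explicit access key + secret key
    return "default_chain"      # e.g. an assumed IAM role on AWS Lambda
```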

The client currently does not support asynchronous calls or streaming. These features may be added in future PRs.

Example of usage:

import os
from adalflow.core import Component, Generator
from adalflow.components.model_client import BedrockAPIClient

PROFILE_NAME = os.getenv("PROFILE_NAME")
REGION_NAME = os.getenv("REGION_NAME")

class SimpleQA(Component):
    def __init__(self):
        super().__init__()
        template = r"""<SYS>
        You are a helpful assistant.
        </SYS>
        User: {{input_str}}
        You:
        """
        self.generator = Generator(
            model_client=BedrockAPIClient(
                aws_profile_name=PROFILE_NAME, 
                aws_region_name=REGION_NAME), 
            model_kwargs={
                "modelId": "anthropic.claude-3-sonnet-20240229-v1:0",
                "inferenceConfig": {
                    "temperature": 0.8
                }
            }, template=template
        )

    def call(self, query):
        return self.generator({"input_str": query})


qa = SimpleQA()
answer = qa("What is LightRAG?")
print(answer)

I'm open to any feedback, notes, or suggestions on this PR.

@chainyo left a comment

This looks great. I hope this gets merged soon; I would love to try AdalFlow on Bedrock right now.

Maybe the example should be added to the docs, no?

@khalilfiremind (Contributor, Author) replied

@chainyo thanks, man!!

Yes, sure, I can add docs.

Moreover, I want to add support for embedding models as a next step!

@Hearsch-Jariwala commented

@khalilfiremind Yeah, this looks great; we need this PR to go through! The use case I am trying to use it for is email attachments, which can contain PDFs, docs, images, Excel files, etc. If you have a tutorial on multimodal RAG for this using Titan embeddings, that would be great to add!

@khalilfiremind (Contributor, Author) replied

@Hearsch-Jariwala thanks!

I will work on adding embedding support in the next few days! 🚀


def __init__(
self,
aws_profile_name=None,
Copy link
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

please add type hinting
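A hedged sketch of what the type-hinted constructor could look like. Only aws_profile_name appears in the quoted diff; the remaining parameters are assumptions drawn from other comments in this thread:

```python
from typing import Optional

class BedrockAPIClient:
    def __init__(
        self,
        aws_profile_name: Optional[str] = None,
        aws_region_name: Optional[str] = None,
        aws_access_key_id: Optional[str] = None,
        aws_secret_access_key: Optional[str] = None,
        aws_connection_timeout: Optional[float] = None,
        aws_read_timeout: Optional[float] = None,
    ) -> None:
        # store connection settings; the real client would build a
        # boto3 session/client from these
        self.aws_profile_name = aws_profile_name
        self.aws_region_name = aws_region_name
        self.aws_access_key_id = aws_access_key_id
        self.aws_secret_access_key = aws_secret_access_key
        self.aws_connection_timeout = aws_connection_timeout
        self.aws_read_timeout = aws_read_timeout
```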

Another comment from a Member:

We also need more detailed instructions on how to set up the env parameters. For now, I don't know what values I need to pass to aws_connection_timeout=None, aws_read_timeout=None.

And why do they matter?

Tokens and profiles are easy to understand.
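For context, these are the standard boto3/botocore connection and read timeouts, expressed in seconds (botocore defaults to 60 for each). A minimal sketch of how the client would presumably forward them, assuming it builds botocore `Config` kwargs; the helper name is hypothetical:

```python
from typing import Optional

def build_boto3_config_kwargs(
    aws_connection_timeout: Optional[float] = None,
    aws_read_timeout: Optional[float] = None,
) -> dict:
    """Map the client's timeout params onto botocore Config kwargs (sketch)."""
    cfg = {}
    if aws_connection_timeout is not None:
        # max seconds to wait while establishing the TCP connection
        cfg["connect_timeout"] = aws_connection_timeout
    if aws_read_timeout is not None:
        # max seconds to wait for the model's response bytes
        cfg["read_timeout"] = aws_read_timeout
    return cfg

# e.g. botocore.config.Config(**build_boto3_config_kwargs(5, 120))
```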

Inline review comment from a Member on:

    api_kwargs = model_kwargs.copy()
    if model_type == ModelType.LLM:
        api_kwargs["messages"] = [
            {"role": "user", "content": [{"text": input}]},

Interesting, so their Converse API takes a list as message content? Why not use "system"? And is this because they support multimodal input, so the list can accept "image" blocks?

@Hearsch-Jariwala commented Sep 24, 2024

@liyin2015
AWS recommends using the Converse operation over InvokeModel for Bedrock when it is supported, since it unifies the inference request format across Amazon Bedrock models and simplifies the management of multi-turn conversations. They mention: "If a model has unique inference parameters, you can also pass those unique parameters to the model. Amazon Bedrock doesn't store any text, images, or documents that you provide as content. The data is only used to generate the response." So I assume that yes, they can accept images as input.

Source:
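To make the discussion concrete, here is a hedged sketch of a Converse-style request body. The field names follow the Converse API shape quoted above (messages with a content-block list, a top-level system field, inferenceConfig); the helper function itself is illustrative, not code from this PR:

```python
from typing import Optional

def build_converse_request(model_id: str, user_text: str,
                           system_text: Optional[str] = None) -> dict:
    """Build a Converse-style request payload (sketch)."""
    request = {
        "modelId": model_id,
        # content is a list of blocks; this list form is what lets the
        # Converse API accept multimodal input (e.g. an {"image": {...}} block)
        "messages": [
            {"role": "user", "content": [{"text": user_text}]},
        ],
        "inferenceConfig": {"temperature": 0.8},
    }
    if system_text is not None:
        # system prompts go in a top-level field, not in the messages list
        request["system"] = [{"text": system_text}]
    return request
```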

Inline review comment from a Member on pyproject.toml:

    @@ -58,7 +58,7 @@ anthropic = { version = "^0.31.1", optional = true }
    google-generativeai = { version = "^0.7.2", optional = true }
    cohere = { version = "^5.5.8", optional = true }
    ollama = { version = "^0.2.1", optional = true }

Please also add boto3 into the following extras section:


[tool.poetry.extras] # allow pip install adalflow[openai, groq]
openai = ["openai"]
groq = ["groq"]
anthropic = ["anthropic"]
cohere = ["cohere"]
google-generativeai = ["google-generativeai"]
pgvector = ["pgvector"]
faiss-cpu = ["faiss-cpu"]
sqlalchemy = ["sqlalchemy"]
torch = ["torch"]
ollama = ["ollama"]
datasets = ["datasets"]
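Concretely, the requested change would presumably add a boto3 entry in both places; the version constraint below is an illustrative placeholder, not taken from the PR:

```toml
# in [tool.poetry.dependencies] (version pin is illustrative):
boto3 = { version = "*", optional = true }

# and in [tool.poetry.extras]:
boto3 = ["boto3"]
```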

@liyin2015 (Member) left a comment

@khalilfiremind Thanks for the work, please address the comments, and ideally we should also add some tests.

@khalilfiremind (Contributor, Author) replied

OK, thanks @liyin2015 for the review. I will modify and re-push.

@liyin2015 (Member) left a comment

Thanks for the work; will merge for now.

There are follow-ups we need to make this a better experience:

  1. Add example code and more explanation in the client docstring.
  2. Make boto3 an optional import and raise a clear ImportError when it is missing.
  3. Be clearer about which models are supported.
  4. Check with the AWS Bedrock team to optimize the experience.

@liyin2015 liyin2015 merged commit f6e01aa into SylphAI-Inc:main Oct 6, 2024
0 of 4 checks passed