Bedrock client #213
Conversation
This looks great. I hope this gets merged soon, I would love to try AdalFlow on Bedrock right now.
Maybe the example should be added to the docs, no?
@chainyo thanks man!! Yes, sure, I can add docs. Moreover, I want to add support for embedding models as a next step!
@khalilfiremind Yeah, this looks great, we need this PR to go through! The use case I am trying to use it for is email attachments, which can contain PDFs, docs, images, Excel files, etc. If you have a tutorial on multimodal RAG for this using Titan embeddings, that would be great to add!
@Hearsch-Jariwala thanks! I will work on adding the embedding in the next few days! 🚀
def __init__(
    self,
    aws_profile_name=None,
please add type hinting
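A possible typed signature, as requested. The class name is taken from the PR title; the extra parameters are assumptions based on the ones mentioned elsewhere in this thread:

```python
from typing import Optional

class BedrockAPIClient:  # class name assumed from the PR title
    def __init__(
        self,
        aws_profile_name: Optional[str] = None,
        aws_region_name: Optional[str] = None,
        aws_access_key_id: Optional[str] = None,
        aws_secret_access_key: Optional[str] = None,
        aws_connection_timeout: Optional[float] = None,
        aws_read_timeout: Optional[float] = None,
    ) -> None:
        # Store credentials/config; the actual client would build a
        # boto3 session from these in a separate init step.
        self.aws_profile_name = aws_profile_name
        self.aws_region_name = aws_region_name
        self.aws_access_key_id = aws_access_key_id
        self.aws_secret_access_key = aws_secret_access_key
        self.aws_connection_timeout = aws_connection_timeout
        self.aws_read_timeout = aws_read_timeout
```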
Also, we need more detailed instructions on how to set up the env parameters. For now, I don't know what values I need to pass to `aws_connection_timeout=None, aws_read_timeout=None`, and why they matter.
Tokens and profiles are easy to understand.
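For reference, both values are in seconds and are passed through to botocore, which defaults each to 60 seconds when unset: the connect timeout bounds establishing the TCP connection, while the read timeout bounds waiting for a response on an open connection. A sketch (the helper name is hypothetical) of translating them into `botocore.config.Config` kwargs:

```python
from typing import Optional

def timeout_config(
    aws_connection_timeout: Optional[float] = None,  # seconds to establish the connection
    aws_read_timeout: Optional[float] = None,        # seconds to wait for a response
) -> dict:
    """Build kwargs for botocore.config.Config; None keeps botocore's 60s defaults."""
    cfg = {}
    if aws_connection_timeout is not None:
        cfg["connect_timeout"] = aws_connection_timeout
    if aws_read_timeout is not None:
        cfg["read_timeout"] = aws_read_timeout
    return cfg
```

These matter mainly for long generations: a slow model can exceed a short read timeout even though the connection itself is healthy.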
api_kwargs = model_kwargs.copy()
if model_type == ModelType.LLM:
    api_kwargs["messages"] = [
        {"role": "user", "content": [{"text": input}]},
Interesting, so their Converse API takes a list as input? Why not use "system"? And is this because they support multimodal, so it can accept "image" as input?
@liyin2015
AWS recommends using the Converse operation over InvokeModel for Bedrock when supported, since it unifies inference requests across Amazon Bedrock models and simplifies the management of multi-turn conversations. They mention: "If a model has unique inference parameters, you can also pass those unique parameters to the model. Amazon Bedrock doesn't store any text, images, or documents that you provide as content. The data is only used to generate the response." So I assume yes, it can accept images as input.
Source:
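For what it's worth, Converse takes the system prompt as a separate top-level `system` parameter rather than a "system" role inside `messages`, and each content block in a message can be text, image, document, etc. A sketch of building such a payload (the helper function is hypothetical; the dict shapes follow the Converse request format):

```python
from typing import Optional

def build_converse_messages(
    prompt: str,
    image_bytes: Optional[bytes] = None,
    image_format: str = "png",
) -> list:
    """Build the `messages` argument for a bedrock-runtime converse() call."""
    content = [{"text": prompt}]
    if image_bytes is not None:
        # Image content blocks sit alongside text blocks in the same message.
        content.append(
            {"image": {"format": image_format, "source": {"bytes": image_bytes}}}
        )
    return [{"role": "user", "content": content}]

# The system prompt is a separate top-level parameter, e.g.:
# client.converse(modelId=..., system=[{"text": "You are helpful."}], messages=...)
```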
@@ -58,7 +58,7 @@ anthropic = { version = "^0.31.1", optional = true }
google-generativeai = { version = "^0.7.2", optional = true }
cohere = { version = "^5.5.8", optional = true }
ollama = { version = "^0.2.1", optional = true }
Please also add boto3 into the following extras section:
[tool.poetry.extras] # allow pip install adalflow[openai, groq]
openai = ["openai"]
groq = ["groq"]
anthropic = ["anthropic"]
cohere = ["cohere"]
google-generativeai = ["google-generativeai"]
pgvector = ["pgvector"]
faiss-cpu = ["faiss-cpu"]
sqlalchemy = ["sqlalchemy"]
torch = ["torch"]
ollama = ["ollama"]
datasets = ["datasets"]
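Assuming the new extra is named `bedrock`, the addition might look like this (the boto3 version pin below is an assumption, not taken from the PR):

```toml
boto3 = { version = "^1.34.0", optional = true }

[tool.poetry.extras]
bedrock = ["boto3"]
```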
@khalilfiremind Thanks for the work, please address the comments, and ideally we should also add some tests.
Ok, thanks @liyin2015 for the review, I will modify and re-push.
Thanks for the work, will merge for now.
There are follow-ups we need to make this a better experience:
- add example code and more explanation in the client docstring
- make boto3 an optional import and raise an informative import error
- be clearer about which models are supported
- check with the AWS Bedrock team to optimize the experience.
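The optional-import follow-up could look something like this generic helper (the function name and error message are illustrative, not from the PR):

```python
import importlib

def require_optional(module_name: str, extra: str):
    """Import an optional dependency, or raise a helpful ImportError
    telling the user which pip extra to install."""
    try:
        return importlib.import_module(module_name)
    except ImportError as e:
        raise ImportError(
            f"{module_name!r} is required for this client. "
            f"Install it with: pip install adalflow[{extra}]"
        ) from e
```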
The AWS Bedrock client has been added.
The user can access LLMs via a single API:
The user can access AWS Bedrock via a profile (if SSO is set up), or via a secret key and access key;
or, if the user assumes a role that has access to Bedrock (e.g. when deploying on AWS Lambda), then there is no need to pass anything, and the client will have access automatically.
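The three authentication paths described above could be resolved roughly like this (the helper name is hypothetical; the returned dict would be passed to `boto3.Session`):

```python
from typing import Optional

def resolve_session_kwargs(
    aws_profile_name: Optional[str] = None,
    aws_access_key_id: Optional[str] = None,
    aws_secret_access_key: Optional[str] = None,
) -> dict:
    """Pick boto3.Session kwargs for the three auth paths."""
    if aws_profile_name:
        # 1) Named profile, e.g. one configured via AWS SSO.
        return {"profile_name": aws_profile_name}
    if aws_access_key_id and aws_secret_access_key:
        # 2) Explicit long-lived credentials.
        return {
            "aws_access_key_id": aws_access_key_id,
            "aws_secret_access_key": aws_secret_access_key,
        }
    # 3) Nothing passed: boto3 falls back to the ambient credential chain
    #    (env vars, instance profile, Lambda execution role, etc.).
    return {}
```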
The client currently does not support asynchronous calls or streaming. These features may be added in future PRs.
Example of usage:
I'm open to any feedback, notes, or suggestions on this PR.