How to Use GPT-4o or Gemini-1.5-pro LLM with the Dockerized Pandas-AI Platform Project #1411
Replies: 1 comment · 2 replies
-
Hey @NathanWu12! 👋 I'm here to help you with any bugs, questions, or contributions while you wait for a human maintainer. Let's dive in and tackle this challenge together!

I wasn't able to find specific information about integrating gpt-4o or gemini-1.5-pro with the pandas-ai platform. However, to integrate a new LLM, you would typically need to modify the … For guidance on integrating new LLMs, you can refer to the …
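For reference, the default `pandasai.json` that ships with the project configures the LLM through the `llm` and `llm_options` fields (the default file points at `BambooLLM`). A hedged sketch of switching it to Gemini could look like the following; the `model` option and the API-key value are placeholders, and whether `llm_options` forwards a `model` argument depends on the `GoogleGemini` constructor in your installed version:

```json
{
  "llm": "GoogleGemini",
  "llm_options": {
    "api_key": "YOUR_GOOGLE_API_KEY",
    "model": "gemini-1.5-pro"
  }
}
```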
-
I would like to use the Gemini LLM. I modified the `llm` field in `pandasai.json` and `df_config.py`, but the log still calls bamboo. Can you help me understand why? @Dusu

pandasai.json: …

df_config.py (excerpt):

```python
from pandasai.constants import DEFAULT_CHART_DIRECTORY
from ..llm import LLM, GoogleGemini, LangchainLLM

class LogServerConfig(TypedDict):
    ...

class Config(BaseModel):
    ...
```

Log: …
-
The issue you're encountering, where the log still calls bamboo instead of Gemini, … Here are a few things to check and try:
…

By ensuring that …
-
I would like to use the gpt-4o or gemini-1.5-pro LLM. Could someone please advise which file under the server path should be modified?