API Key error #172

Open
BodapatiNirupamasai opened this issue Oct 3, 2024 · 8 comments

Comments

@BodapatiNirupamasai

litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

@Tlaloc-Es

Same here

from langchain_openai import AzureChatOpenAI

azure_chatgpt = AzureChatOpenAI(
    azure_deployment="****",
    api_version="****",
    api_key="****",
    azure_endpoint="https://****.openai.azure.com",    
)

This works perfectly on its own, but when I try to set that azure_chatgpt as an agent's LLM I get the same error:

litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

@jamil-z

jamil-z commented Oct 31, 2024

Hello! I don't know if anyone was able to figure out how to connect Azure with CrewAI agents and tools on current versions, like 0.7 and above.

I saw that Azure only works with CrewAI 0.11.

@rao208

rao208 commented Nov 6, 2024

import configparser
import os

from crewai import LLM

# Disable CrewAI telemetry (avoids the connection-timeout error shown below)
os.environ["OTEL_SDK_DISABLED"] = "true"

Without setting "OTEL_SDK_DISABLED" to "true", I was getting the error below:

"requests.exceptions.ConnectTimeout: HTTPConnectionPool(host='telemetry.crewai.com', port=4318): Max retries
exceeded with url: /v1/traces (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at
0x7f34caf38280>, 'Connection to telemetry.crewai.com timed out. (connect timeout=10)')"

config = configparser.ConfigParser()
config.read("config.ini")

OpenAI35kTurbo = LLM(
    model=f"azure/{config['AZURE_OPEN_AI']['DEPLOYMENT_NAME']}",
    api_version=config["AZURE_OPEN_AI"]["AZURE_OPENAI_API_VERSION"],
    api_key=config["AZURE_OPEN_AI"]["AZURE_OPENAI_API_KEY"],
    base_url=config["AZURE_OPEN_AI"]["BASE_URL"],
)

Your base_url will be https://${INSTANCE_NAME}.openai.azure.com; the base_url parameter expects the Azure endpoint.

This works perfectly well for me.

Please note the azure/ prefix in model=f"azure/{config['AZURE_OPEN_AI']['DEPLOYMENT_NAME']}". That prefix is important.
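
For reference, here is a minimal sketch (the role, goal, and task text are placeholders I added, not from this thread) of how such an azure/-prefixed LLM object is then wired into an agent and crew:

from crewai import Agent, Task, Crew

# Attach the Azure-backed LLM configured above to a simple one-task crew.
analyst = Agent(
    role="Analyst",
    goal="Summarize the provided text",
    backstory="A concise summarizer.",
    llm=OpenAI35kTurbo,  # the LLM object built with the azure/ prefix above
)

summary_task = Task(
    description="Summarize the input text in one sentence.",
    expected_output="A one-sentence summary.",
    agent=analyst,
)

crew = Crew(agents=[analyst], tasks=[summary_task])
print(crew.kickoff())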

@topratiksharma

The setup below worked for me as well.

The latest CrewAI doesn't have correct details for the setup; the configuration below works for me:

import os
from crewai import LLM

def getLLM():
    # Placeholder values: substitute your Azure deployment name, API version,
    # API key, and endpoint (https://<instance>.openai.azure.com).
    return LLM(
        model="azure/modelname",
        api_version="version",
        api_key="key",
        base_url="base-url"
    )
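
As a usage illustration (the role, goal, and backstory below are placeholders I added, not part of the original comment), the returned LLM object is passed to an agent via its llm parameter:

from crewai import Agent

researcher = Agent(
    role="Researcher",
    goal="Answer questions using the Azure-hosted model",
    backstory="Placeholder backstory.",
    llm=getLLM(),
)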

@pranjalagg

I am encountering the same error. I am trying to use Google Gemini, and the above methods do not work for me. I would appreciate any help. Thanks!

@topratiksharma

topratiksharma commented Nov 14, 2024

I am encountering the same error. I am trying to use Google Gemini, and the above methods do not work for me. I would appreciate any help. Thanks!

Can you post your error message along with your setup? It will help in looking into the issue.

@pranjalagg

Sure,

LLM:
from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash", google_api_key=GEM_API)

Agent:

agent1 = Agent(
    role="Communication Analyst",
    goal="Understand the definitions of the classes provided and make a rational decision on how to classify any conversation segment into classes",
    backstory=AGENT1_BACKSTORY,
    LLM=llm,
    api_key=GEM_API
)

Output when the above runs: Checking env var OPENAI_API_KEY: None

... code continues ...

crew.kickoff()
Output:
ERROR:root:LiteLLM call failed: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
WARNING:opentelemetry.trace:Overriding of current TracerProvider is not allowed

@Yash-Bavishi

Yash-Bavishi commented Nov 20, 2024

@pranjalagg

The llm attribute of the Agent class expects an object of the LLM class from crewai rather than one from langchain_google_genai.

I would request you to try the code below:

# Imports LLM class from crewai
from crewai import LLM, Agent

# LLM Object from crewai package
llm = LLM(model="gemini-1.5-flash", api_key=GEM_API)

# create your agent
agent1 = Agent(
    role="Communication Analyst",
    goal="Understand the definitions of the classes provided and make a rational decision on how to classify any conversation segment into classes",
    backstory=AGENT1_BACKSTORY,
    llm=llm,
    api_key=GEM_API
)
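
As a hedged follow-up sketch (the task description and expected output are placeholders, not from the original comment), the Gemini-backed agent would then be run through a crew like this:

from crewai import Task, Crew

classification_task = Task(
    description="Classify the given conversation segment into one of the provided classes.",
    expected_output="The name of the chosen class.",
    agent=agent1,
)

crew = Crew(agents=[agent1], tasks=[classification_task])
print(crew.kickoff())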
