Add a Provider in AI Studio
AI Studio is designed to seamlessly integrate a wide range of Large Language Models (LLMs) from various sources, giving you the flexibility to choose the models that best suit your needs. A provider in AI Studio is a combination of a model source and a specific model. The first step in using AI Studio is to add a provider, which makes the LLM available within the app for immediate use.
Currently, AI Studio supports integration with the following sources:
- OpenAI
- Anthropic
- Mistral
- Fireworks
- Self-hosted models
For self-hosted models, AI Studio offers compatibility with tools such as LM Studio, llama.cpp, and Ollama, allowing you to integrate and prompt your own custom models with ease.
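The self-hosted tools above all expose OpenAI-compatible HTTP APIs, which is what lets AI Studio talk to them the same way it talks to a cloud provider. As a hedged sketch (the base URLs below are the tools' usual defaults and may differ in your setup):

```python
# Illustrative sketch: typical default base URLs for self-hosted tools.
# Adjust host and port to match your own installation.
DEFAULT_BASE_URLS = {
    "lm_studio": "http://localhost:1234/v1",   # LM Studio local server default
    "llama_cpp": "http://localhost:8080/v1",   # llama.cpp HTTP server default
    "ollama":    "http://localhost:11434/v1",  # Ollama default
}

def models_endpoint(tool: str) -> str:
    """Return the OpenAI-style /models listing URL for a given tool."""
    return f"{DEFAULT_BASE_URLS[tool]}/models"

print(models_endpoint("lm_studio"))  # http://localhost:1234/v1/models
```

If a tool is reachable at its base URL, the same `/models` endpoint AI Studio queries can also be opened in a browser to confirm the server is up.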
1. Go to Settings
2. Select Configure Providers
3. Add Provider
4. Add a Model
4.1. Add a Cloud Model (Using OpenAI as an Example)
To integrate an OpenAI model, follow these steps:
4.1.1. Select OpenAI as the Provider:
a. Open the Provider drop-down menu.
b. Select OPEN_AI from the list of available providers.
4.1.2. Obtain your API key from the OpenAI platform:
a. Log in to the OpenAI platform using your OpenAI credentials.
b. Navigate to Settings (in the upper-right corner).
c. Click API keys (in the left-hand side menu).
d. Click + Create new secret key (upper-right corner).
e. Enter a name for the key (e.g., “AI Studio”) to make it easy to recognize later.
f. Generate the key and copy it.
Important: You will only see the key once. Save it securely (e.g., in a password manager) before leaving the page.
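If you plan to use the same key in scripts as well as in AI Studio, a common pattern is to keep it in an environment variable instead of pasting it into files. A minimal sketch, assuming the conventional variable name `OPENAI_API_KEY` (the helper itself is hypothetical, not part of AI Studio):

```python
import os

def read_api_key(var: str = "OPENAI_API_KEY") -> str:
    """Read the API key from an environment variable at run time."""
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"{var} is not set; export it before running.")
    return key
```

AI Studio only needs the key pasted once into its API Key field; the environment variable is purely for your own tooling.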
4.1.3. Authorize the Connection in AI Studio:
a. Enter the copied API key in the API Key field in AI Studio.
b. When adding OpenAI models, the Hostname and Host fields are not required and can be disregarded.
4.1.4. Load, Select, and Configure a Model:
a. Once authenticated, click Load to get the list of OpenAI models.
b. Use the drop-down menu next to Load to view the available OpenAI models (e.g., gpt-4o).
c. Select the model you want to add.
d. Enter an Instance Name to identify the model in your list of models.
4.1.5. Click Add to save the configured model to your list of providers.
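Under the hood, prompting an OpenAI model with the provider you just configured boils down to an authenticated HTTPS POST. The sketch below is illustrative only (it is not AI Studio's actual code, and the key value is a placeholder); the model name matches the `gpt-4o` example above:

```python
import json

def build_chat_request(api_key: str, model: str, prompt: str) -> tuple[dict, bytes]:
    """Assemble the headers and JSON body of an OpenAI-style chat request."""
    headers = {
        "Authorization": f"Bearer {api_key}",  # the key obtained in step 4.1.2
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return headers, body

headers, body = build_chat_request("sk-placeholder", "gpt-4o", "Hello")
```

This is why an invalid or revoked key fails immediately at the Load step: the Authorization header is checked on every request.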
4.2. Add a Self-Hosted Model (Using LM Studio as an Example)
To integrate a self-hosted model from LM Studio, follow these steps:
4.2.1. Select Self-Hosted as the Provider:
a. Open the Provider drop-down menu.
b. Select SELF_HOSTED from the list of available providers.
4.2.2. When adding a self-hosted model with LM Studio, the API Key field is not required and can be disregarded.
4.2.3. Add the Hostname and Host:
a. Open LM Studio. For detailed information about LM Studio setup, refer to the official documentation.
b. Navigate to the Developer environment (in the left-hand side menu).
c. Select the desired model from the Select a model to load list (at the top center).
d. Once the model is loaded, click Start Server (upper-left corner).
e. Copy the Hostname, which is displayed as the local server address under API Usage, typically in the format of an IP address and port.
f. Paste the copied address into the corresponding Hostname field in AI Studio.
g. Select LM Studio in the Host drop-down menu.
4.2.4. Load, Select, and Configure a Model:
a. Click Load to get a list of available models.
b. Use the drop-down menu next to Load to view the LM Studio models (e.g., aya-23-35b).
c. Select the model.
d. Enter an Instance Name to identify the model in your list of models.
4.2.5. Click Add to save the configured model to your list of providers.
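The Load step above works because LM Studio's server answers an OpenAI-style `/v1/models` request with a JSON list of loaded models. A hedged sketch of what parsing that response looks like (the sample payload is illustrative; only the `aya-23-35b` id comes from the example above):

```python
import json

# Illustrative sample of an OpenAI-compatible /v1/models response.
SAMPLE_RESPONSE = """
{"object": "list",
 "data": [{"id": "aya-23-35b", "object": "model"}]}
"""

def model_ids(raw: str) -> list[str]:
    """Extract model ids from a /v1/models JSON response."""
    return [m["id"] for m in json.loads(raw)["data"]]

print(model_ids(SAMPLE_RESPONSE))  # ['aya-23-35b']
```

If the drop-down next to Load stays empty, querying the same endpoint by hand (at the hostname you copied in step 4.2.3) is a quick way to check whether the server is running and a model is actually loaded.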