---
title: "Client Registry"
---

If you need to modify the model or parameters of an LLM client at runtime, you can pass a `ClientRegistry` when calling any BAML function.

<CodeGroup>

```python Python
from baml_py import ClientRegistry

async def run():
    cr = ClientRegistry()
    # Creates a new client
    cr.add_llm_client(name='MyAmazingClient', provider='openai', options={
        "model": "gpt-4o",
        "temperature": 0.7,
        "api_key": "sk-..."
    })
    # Sets MyAmazingClient as the primary client
    cr.set_primary('MyAmazingClient')

    # ExtractResume will now use MyAmazingClient as the calling client
    res = await b.ExtractResume("...", { "client_registry": cr })
```
```typescript TypeScript
import { ClientRegistry } from '@boundaryml/baml'

async function run() {
  const cr = new ClientRegistry()
  // Creates a new client
  cr.addLlmClient({ name: 'MyAmazingClient', provider: 'openai', options: {
    model: "gpt-4o",
    temperature: 0.7,
    api_key: "sk-..."
  }})
  // Sets MyAmazingClient as the primary client
  cr.setPrimary('MyAmazingClient')

  // ExtractResume will now use MyAmazingClient as the calling client
  const res = await b.ExtractResume("...", { clientRegistry: cr })
}
```

```ruby Ruby
Not available yet
```

</CodeGroup>
## ClientRegistry Interface

import ClientConstructorParams from '/snippets/client-params.mdx'

<Tip>
Note: `ClientRegistry` is imported from `baml_py` in Python and `@boundaryml/baml` in TypeScript, not from `baml_client`.

As `ClientRegistry` matures, we will add a more type-safe and ergonomic interface directly in `baml_client`. See [Github issue #766](https://github.com/BoundaryML/baml/issues/766).
</Tip>

Methods use `snake_case` in Python and `camelCase` in TypeScript.
### add_llm_client / addLlmClient

Adds an LLM client to the registry.

<ParamField
  path="name"
  type="string"
  required
>
  The name of the client.

  <Warning>
    If this name matches a client defined in your .baml files, the registry's client overrides the existing one whenever the `ClientRegistry` is used.
  </Warning>
</ParamField>

<ClientConstructorParams />

<ParamField path="retry_policy" type="string">
  The name of a retry policy that is already defined in a .baml file. See [Retry Policies](/docs/snippets/clients/retry.mdx).
</ParamField>
### set_primary / setPrimary

Sets the client the function will use, replacing the `client` property declared on the function.

<ParamField
  path="name"
  type="string"
  required
>
  The name of the client to use.

  This can be a new client added via `add_llm_client` or an existing client defined in a .baml file.
</ParamField>
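The resolution rule can be sketched in a few lines. This is a hypothetical illustration of the behavior described above, not BAML's implementation:

```python
from typing import Optional

# Hypothetical sketch of client resolution; not BAML's real code.
def resolve_client(function_default: str, primary: Optional[str]) -> str:
    # A primary client set via set_primary replaces the `client`
    # property declared on the BAML function.
    return primary if primary is not None else function_default

# No primary set: the function's declared client is used.
assert resolve_client("GPT4", None) == "GPT4"
# After set_primary('MyAmazingClient'): the registry client is used instead.
assert resolve_client("GPT4", "MyAmazingClient") == "MyAmazingClient"
```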
The `ClientConstructorParams` snippet imported above (`/snippets/client-params.mdx`) defines the shared constructor parameters:

<ParamField path="provider" type="string" required>
  This configures which provider to use. The provider is responsible for handling the actual API calls to the LLM service, and determines the URL of the request the BAML runtime makes.

  | Provider Name  | Docs                                                    | Notes                                                      |
  | -------------- | ------------------------------------------------------- | ---------------------------------------------------------- |
  | `openai`       | [OpenAI](/docs/snippets/clients/providers/openai)       | Anything that follows OpenAI's API exactly                 |
  | `ollama`       | [Ollama](/docs/snippets/clients/providers/ollama)       | Alias for an openai client but with default Ollama options |
  | `azure-openai` | [Azure OpenAI](/docs/snippets/clients/providers/azure)  |                                                            |
  | `anthropic`    | [Anthropic](/docs/snippets/clients/providers/anthropic) |                                                            |
  | `google-ai`    | [Google AI](/docs/snippets/clients/providers/gemini)    |                                                            |
  | `fallback`     | [Fallback](/docs/snippets/clients/fallback)             | Used to chain models conditional on failures               |
  | `round-robin`  | [Round Robin](/docs/snippets/clients/round-robin)       | Used to load balance                                       |
</ParamField>

<ParamField path="options" type="dict[str, Any]" required>
  These vary per provider; see the provider-specific documentation for details. Generally they are passed through to the POST request made to the LLM.
</ParamField>
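As a rough sketch of what "pass-through options" means for an OpenAI-style provider (assumed helper names; not the BAML runtime's actual code), option keys end up in the body of the POST request, while credentials like `api_key` belong in the request headers:

```python
import json

# Hypothetical sketch of pass-through options; not the BAML runtime's code.
def build_openai_request_body(messages: list, options: dict) -> str:
    # Keys such as "model" and "temperature" are forwarded verbatim in the
    # POST body; "api_key" goes in the Authorization header, not the body.
    body = {k: v for k, v in options.items() if k != "api_key"}
    body["messages"] = messages
    return json.dumps(body)

payload = json.loads(build_openai_request_body(
    [{"role": "user", "content": "..."}],
    {"model": "gpt-4o", "temperature": 0.7, "api_key": "sk-..."},
))
assert payload["model"] == "gpt-4o" and "api_key" not in payload
```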