Merge branch 'canary' into sam/py-release2
sxlijin authored Dec 5, 2024
2 parents aaa6614 + 6f99a28 commit bae3962
Showing 10 changed files with 48 additions and 15 deletions.
7 changes: 7 additions & 0 deletions .github/workflows/build-python-release.reusable.yaml
@@ -65,6 +65,13 @@ jobs:
env: ${{ matrix._.env || fromJSON('{}') }}
with:
target: ${{ matrix._.target }}
# TODO: unpin the maturin version
# 1.7.7+ builds wheels that gh-action-pypi-publish can't upload
# see:
# - release failure: https://github.com/BoundaryML/baml/actions/runs/12154379580/job/33894619438
# - maturin changelog: https://github.com/PyO3/maturin/blob/ba4d482809a73669242bd7fe7dd5f9106f42702f/Changelog.md?plain=1#L13
# - gh-action-pypi-publish issue: https://github.com/pypa/gh-action-pypi-publish/issues/310
maturin-version: "1.7.6"
command: build
# building in engine/ ensures that we pick up .cargo/config.toml
working-directory: engine
2 changes: 2 additions & 0 deletions fern/03-reference/baml/clients/providers/anthropic.mdx
@@ -67,6 +67,8 @@ client<llm> MyClient {
```
</ParamField>

<Markdown src="/snippets/role-selection.mdx" />

<Markdown src="/snippets/allowed-role-metadata.mdx" />

<Markdown src="/snippets/supports-streaming.mdx" />
31 changes: 26 additions & 5 deletions fern/03-reference/baml/clients/providers/aws-bedrock.mdx
@@ -13,8 +13,13 @@ Example:
client<llm> MyClient {
provider aws-bedrock
options {
model_id "anthropic.claude-3-5-sonnet-20240620-v1:0"
api_key env.MY_OPENAI_KEY
inference_configuration {
max_tokens 100
}
// model_id "mistral.mistral-7b-instruct-v0:2"
// model "anthropic.claude-3-5-sonnet-20240620-v1:0"
// model_id "anthropic.claude-3-haiku-20240307-v1:0"
model "meta.llama3-8b-instruct-v1:0"
}
}
```
@@ -26,9 +31,9 @@ mechanisms supported by the
SDK](https://docs.rs/aws-config/latest/aws_config/index.html), including but not
limited to:

- `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` as set in your environment variables
- loading the specified `AWS_PROFILE` from `~/.aws/config`
- built-in authn for services running in EC2, ECS, Lambda, etc.
- specifying the access key ID, secret access key, and region directly in the client options (see below)
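
For instance, a client that passes credentials explicitly (rather than relying on the SDK's ambient resolution) might look like the sketch below. The `access_key_id` and `secret_access_key` options are documented later on this page; `MyBedrockClient` and the `env.*` variable names are placeholders.

```baml
client<llm> MyBedrockClient {
  provider aws-bedrock
  options {
    model "meta.llama3-8b-instruct-v1:0"
    // Explicit credentials; omit these to fall back to the
    // SDK's default credential chain described above.
    access_key_id env.AWS_ACCESS_KEY_ID
    secret_access_key env.AWS_SECRET_ACCESS_KEY
  }
}
```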


## Playground setup
@@ -50,6 +55,8 @@ Add these three environment variables to your extension variables to use the AWS
We don't have any checks for this field, you can pass any string you wish.
</ParamField>

<Markdown src="/snippets/role-selection.mdx" />

<Markdown src="/snippets/allowed-role-metadata-basic.mdx" />
<Markdown src="/snippets/supports-streaming.mdx" />

@@ -62,6 +69,21 @@ Add these three environment variables to your extension variables to use the AWS
We don't have any checks for this field, you can pass any string you wish.
</ParamField>

<ParamField
path="access_key_id"
type="string"
>
The AWS access key ID to use. **Default: `AWS_ACCESS_KEY_ID` environment variable**
</ParamField>

<ParamField
path="secret_access_key"
type="string"
>
The AWS secret access key to use. **Default: `AWS_SECRET_ACCESS_KEY` environment variable**
</ParamField>


## Forwarded options

<ParamField
@@ -72,7 +94,7 @@ Add these three environment variables to your extension variables to use the AWS
</ParamField>

<ParamField
path="model_id (or model)"
path="model (or model_id)"
type="string"
>
The model to use.
@@ -110,7 +132,6 @@ client<llm> MyClient {
max_tokens 1000
temperature 1.0
top_p 0.8
stop_sequence ["_EOF"]
}
}
}
2 changes: 2 additions & 0 deletions fern/03-reference/baml/clients/providers/azure.mdx
@@ -95,6 +95,8 @@ client<llm> MyClient {
```
</ParamField>

<Markdown src="/snippets/role-selection.mdx" />

<Markdown src="/snippets/allowed-role-metadata-basic.mdx" />

<Markdown src="/snippets/supports-streaming.mdx" />
2 changes: 2 additions & 0 deletions fern/03-reference/baml/clients/providers/google-ai.mdx
@@ -81,6 +81,8 @@ client<llm> MyClient {
```
</ParamField>

<Markdown src="/snippets/role-selection.mdx" />

<Markdown src="/snippets/allowed-role-metadata-basic.mdx" />

<Markdown src="/snippets/supports-streaming.mdx" />
2 changes: 2 additions & 0 deletions fern/03-reference/baml/clients/providers/ollama.mdx
@@ -61,6 +61,8 @@ client<llm> MyClient {
```
</ParamField>

<Markdown src="/snippets/role-selection.mdx" />

<Markdown src="/snippets/allowed-role-metadata-basic.mdx" />

<Markdown src="/snippets/supports-streaming.mdx" />
2 changes: 2 additions & 0 deletions fern/03-reference/baml/clients/providers/openai-generic.mdx
@@ -62,6 +62,8 @@ client<llm> MyClient {

</ParamField>

<Markdown src="/snippets/role-selection.mdx" />

<Markdown src="/snippets/allowed-role-metadata-basic.mdx" />

<Markdown src="/snippets/supports-streaming.mdx" />
2 changes: 2 additions & 0 deletions fern/03-reference/baml/clients/providers/openai.mdx
@@ -70,6 +70,8 @@ client<llm> MyClient {

</ParamField>

<Markdown src="/snippets/role-selection.mdx" />

<Markdown src="/snippets/allowed-role-metadata-basic.mdx" />

<Markdown src="/snippets/supports-streaming-openai.mdx" />
11 changes: 2 additions & 9 deletions fern/03-reference/baml/clients/providers/vertex.mdx
@@ -171,15 +171,6 @@ The options are passed through directly to the API, barring a few. Here's a short




<ParamField
path="default_role"
type="string"
>
The default role for any prompts that don't specify a role. **Default: `user`**

</ParamField>

<ParamField
path="model"
type="string"
@@ -217,6 +208,8 @@ client<llm> MyClient {
```
</ParamField>

<Markdown src="/snippets/role-selection.mdx" />

<Markdown src="/snippets/allowed-role-metadata-basic.mdx" />

<Markdown src="/snippets/supports-streaming.mdx" />
2 changes: 1 addition & 1 deletion fern/snippets/role-selection.mdx
@@ -13,5 +13,5 @@
>
Which roles should we forward to the API? **Default: `["system", "user", "assistant"]` usually, but some models like OpenAI's `o1-mini` will use `["user", "assistant"]`**

Anything not in this list will be set to the `default_role`.
When building prompts, any role not in this list will be set to the `default_role`.
</ParamField>
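
As a sketch of how role remapping plays out in a client definition (assuming the option documented by the surrounding ParamField is named `allowed_roles`, which is not visible in this hunk, and that the model names are illustrative):

```baml
client<llm> MyClient {
  provider openai
  options {
    model "o1-mini"
    // o1-mini does not accept system messages, so only forward these:
    allowed_roles ["user", "assistant"]
    // Any other role (e.g. system) is rewritten to default_role.
    default_role "user"
  }
}
```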
