From c62cef4949e1326c8da88dc46df762165c8f7b87 Mon Sep 17 00:00:00 2001
From: Ethan <36282608+etbyrd@users.noreply.github.com>
Date: Sun, 15 Dec 2024 20:09:25 -0800
Subject: [PATCH] "Switching LLMs" Docs Fixes (#1244)
Two updates to
https://docs.boundaryml.com/guide/baml-basics/switching-llms:
1. The link in the final paragraph for `provider documentation` points
to https://docs.boundaryml.com/guide/baml-basics/switching-llms#fields.
This seems like an error because in the first paragraph, `LLM Providers
Reference` points to
https://docs.boundaryml.com/ref/llm-client-providers/open-ai.
2. The `retry policies` link is dead. This updates it to
https://docs.boundaryml.com/ref/llm-client-strategies/retry-policy
(a rough sketch of the retry-policy config that page documents is included below).

I would test these to be 100% sure, but it looks like I need a Fern account?
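For context only, here is a minimal sketch of the kind of BAML configuration the retry-policy reference covers. The policy name, delays, provider, and model values are illustrative placeholders, not taken from the docs page:

```baml
// Illustrative retry policy; the name and values are placeholders.
retry_policy ExampleExponential {
  max_retries 3
  strategy {
    type exponential_backoff
    delay_ms 300
    multiplier 1.5
  }
}

// A client that opts into the policy by name.
client<llm> ExampleClient {
  provider openai
  retry_policy ExampleExponential
  options {
    model "gpt-4o"
    api_key env.OPENAI_API_KEY
  }
}
```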
----
> [!IMPORTANT]
> Fixes incorrect links in `switching-llms.mdx` for provider documentation and retry policies.
>
> - **Documentation Fixes**:
>   - Corrects the `provider documentation` link in `switching-llms.mdx` to point to `/ref/llm-client-providers/open-ai`.
>   - Updates the `retry policies` link in `switching-llms.mdx` to `/ref/llm-client-strategies/retry-policy`.
>
> This description was created by [Ellipsis](https://www.ellipsis.dev?ref=BoundaryML%2Fbaml&utm_source=github&utm_medium=referral) for 2bba72389e7104d2463cd91e50e2875d46c5f8e7. It will automatically update as commits are pushed.
---------
Co-authored-by: aaronvg
---
fern/01-guide/04-baml-basics/switching-llms.mdx | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/fern/01-guide/04-baml-basics/switching-llms.mdx b/fern/01-guide/04-baml-basics/switching-llms.mdx
index ccb950ac3..92ffc2049 100644
--- a/fern/01-guide/04-baml-basics/switching-llms.mdx
+++ b/fern/01-guide/04-baml-basics/switching-llms.mdx
@@ -50,8 +50,8 @@ function MakeHaiku(topic: string) -> string {
}
```
-Consult the [provider documentation](#fields) for a list of supported providers
-and models, the default options, and setting [retry policies](/docs/reference/retry-policy).
+Consult the [provider documentation](/ref/llm-client-providers/open-ai) for a list of supported providers
+and models, the default options, and setting [retry policies](/ref/llm-client-strategies/retry-policy).
If you want to specify which client to use at runtime, in your Python/TS/Ruby code,