
add some new docs
aaronvg committed May 31, 2024
1 parent 5cedf26 commit a3ed2f5
Showing 42 changed files with 271 additions and 998 deletions.
33 changes: 13 additions & 20 deletions docs/docs/guides/boundary_studio/tracing-tagging.mdx
title: "Tracing and tagging functions"
---

BAML allows you to trace any function with the **@trace** decorator.
This makes the function's input and output show up in the Boundary dashboard. It works for any Python function you define yourself; BAML LLM functions (and any other function declared in a .baml file) are traced by default. Logs are only sent to the dashboard if you set up your environment variables correctly.

Make sure you also Ctrl+S a .baml file to generate the `baml_client`.
In the example below, we trace each of the two functions `pre_process_text` and `full_analysis`:

```python
from baml_client import baml
from baml_client.types import Book, AuthorInfo
from baml_client.tracing import trace

# You can also add a custom name with trace(name="my_custom_name")
# ...

@trace
async def full_analysis(book: Book):
return book_analysis


@trace
async def test_book1():
    content = """Before I could reply that he [Gatsby] was my neighbor...
    """
    processed_content = await pre_process_text(content)
    return await full_analysis(
        Book(
            title="The Great Gatsby",
            author=AuthorInfo(firstName="F. Scott", lastName="Fitzgerald"),
            content=processed_content,
        ),
    )
```
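As the comment in the example notes, `trace` also accepts a custom name. A minimal sketch (the helper function below is hypothetical, only the `name` keyword comes from the example above):

```python
from baml_client.tracing import trace

# The custom name replaces the function name shown in the Boundary dashboard.
# normalize_book_text / pre_process_text_v2 are hypothetical illustration names.
@trace(name="normalize_book_text")
async def pre_process_text_v2(raw: str) -> str:
    return raw.strip()
```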


27 changes: 6 additions & 21 deletions docs/docs/guides/hello_world/baml-project-structure.mdx
your application, depending on the generators configured in your `main.baml`:

```rust main.baml
generator MyGenerator {
  output_type typescript
  // This is where the generated baml_client will be written to
  output_dir "../"
}
```

Here is the typical project structure:
```bash
.
├── baml_client/ # Generated code
├── baml_src/ # Prompts and BAML tests live here
│   └── foo.baml
# The rest of your project (not generated nor used by BAML)
├── app/
│   ├── __init__.py
│   └── main.py
└── pyproject.toml

```

function declarations, prompts, retry policies, etc. It also contains
the generator configuration that tells BAML how to transpile your BAML code.

2. `baml_client/` is where the BAML compiler will generate code for you,
based on the types and functions you define in your BAML code. Here's how you'd access the generated functions from `baml_client`:

<CodeGroup>
```python Python
# ...
```

```typescript TypeScript
const use_llm_for_task = async () => {
  // ...
}
```
</CodeGroup>
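Since the tabs above are abbreviated, here's a minimal sketch of the Python usage. `YourAIFunction` is a placeholder for one of your generated functions, not a real name:

```python
import asyncio
from baml_client import baml as b

async def use_llm_for_task():
    # Call a generated BAML function; YourAIFunction is a placeholder name.
    return await b.YourAIFunction("some input")

asyncio.run(use_llm_for_task())
```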


<Warning>
**You should never edit any files inside the baml_client directory** as the whole
directory is auto-generated and will be overwritten on the next save.
</Warning>
<Tip>
If you ever run into any issues with the generated code (like merge
conflicts), you can always delete the `baml_client` directory and it will get
regenerated automatically on save.
</Tip>

### Imports
72 changes: 12 additions & 60 deletions docs/docs/guides/hello_world/testing-ai-functions.mdx
title: "Testing AI functions"
---


## Overview

One important way to ensure your AI functions work as expected is to write tests. This matters especially once your AI functions are used in production, or when you're working with a team.

To test functions:
1. Install the VSCode extension
2. Create a test in any .baml file:
```rust
test MyTest {
  functions [ExtractResume]
  args {
    resume_text "hello"
  }
}
```

3. Run the test in the VSCode extension!


## Evaluating test results

We have more capabilities like assertions coming soon!
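In the meantime, one workaround is to call the generated function directly from an ordinary pytest test and assert on the parsed output. A minimal sketch, assuming `pytest-asyncio` is installed and `ExtractResume` is one of your BAML functions:

```python
import pytest
from baml_client import b

@pytest.mark.asyncio
async def test_extract_resume():
    # Plain-Python assertions on the structured output.
    resume = await b.ExtractResume("John Doe\njohn@doe.com\nPython, Rust")
    assert resume.name is not None
```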
21 changes: 14 additions & 7 deletions docs/docs/guides/hello_world/writing-ai-functions.mdx
title: "BAML AI functions in 2 minutes"

### Prerequisites

Follow the [installation](/v3/home/installation) instructions.

{/* The starting project structure will look something like this: */}
{/* <img src="/images/v3/baml_filetree.png" width={200} /> */}

## Overview

and agents.

The best way to learn BAML is to run an example in our web playground -- [PromptFiddle.com](https://promptfiddle.com).

But at a high level, BAML is simple to use -- prompts are built using [Jinja syntax](https://jinja.palletsprojects.com/en/3.1.x/) to make working with strings easier. We extended Jinja with type support and static analysis of your template variables, and the BAML VSCode extension gives you a real-time preview of your prompts, no matter how much logic they use.

Here's an example from PromptFiddle:

```rust baml_src/main.baml
client<llm> GPT4Turbo {
  provider openai
  options {
    model gpt-4-turbo
    api_key env.OPENAI_API_KEY
  }
}

// Declare the Resume type we want the AI function to return
class Resume {
  name string
  // ...
}
```

All your types become Pydantic models in Python, or type definitions in TypeScript.

## 2. Usage in Python or TypeScript

Our VSCode extension automatically generates a **baml_client** in your language of choice. (Click the tabs to switch between Python and TypeScript.)

<CodeGroup>

```python Python
from baml_client import baml as b
# BAML types get converted to Pydantic models
from baml_client.types import Resume
import asyncio

async def main():
    # ...
```
</CodeGroup>
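For a complete picture, a minimal self-contained sketch of how `main()` typically continues, assuming `ExtractResume` is the generated function for the Resume example above:

```python
import asyncio
from baml_client import baml as b
from baml_client.types import Resume

async def main():
    # Call the generated function and get back a typed Pydantic model.
    resume: Resume = await b.ExtractResume("John Doe\njohn@doe.com")
    print(resume.name)

asyncio.run(main())
```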
65 changes: 24 additions & 41 deletions docs/docs/guides/streaming/streaming.mdx
title: "Streaming structured data"
---


### Streaming partial objects
The following returns an object that slowly gets filled in as the response comes in. This is useful if you want to start processing the response before it's fully complete.
You can stream anything from a `string` output type to a complex object.
Example:

```json
{"prop1": "hello how are you", "prop2": "I'm good, how are you?", "prop3": "I'm doing great, thanks for asking!"}
```

### Python
```python FastAPI
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

from baml_client import b

app = FastAPI()

@app.get("/extract_resume")
async def extract_resume(resume_text: str):
    async def stream_resume(resume_text: str):
        # b.stream exposes a streaming version of every BAML function.
        stream = b.stream.ExtractResume(resume_text)
        async for chunk in stream:
            yield str(chunk.model_dump_json()) + "\n"

    return StreamingResponse(stream_resume(resume_text), media_type="text/plain")
```
You can also get the deltas from the `output` using `output.delta`.
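Outside of a web server, you can consume the same stream directly. A minimal sketch, assuming the Python stream exposes `get_final_response()` as the counterpart of `getFinalResponse()` in the TypeScript example below:

```python
import asyncio
from baml_client import b

async def main():
    stream = b.stream.ExtractResume("John Doe\njohn@doe.com")
    async for partial in stream:
        print(partial)  # a partially-filled object that grows with each chunk
    final = await stream.get_final_response()
    print(final)

asyncio.run(main())
```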

### TypeScript
```typescript
import { b } from '../baml_client'; // or whatever path baml_client is in

export async function streamText() {
  const stream = b.stream.MyFunction(/* your function's input */);
  for await (const output of stream) {
    console.log(`streaming: ${output}`); // the partial output type of your function
  }

  const finalOutput = await stream.getFinalResponse();
  console.log(`final response: ${finalOutput}`);
}
```

### Caveats
Streaming is not supported with:
1. Fallback clients
2. Retry policies (it may work, but there may be unknown behaviors)
3. Union types
