updated docs (#70)
ShreyaR authored Mar 20, 2023
1 parent 9715037 commit d1588c3
Showing 5 changed files with 15 additions and 13 deletions.
6 changes: 3 additions & 3 deletions README.md
```diff
@@ -15,9 +15,9 @@ _Note: Guardrails is an alpha release, so expect sharp edges and bugs._
 
 Guardrails is a Python package that lets a user add structure, type and quality guarantees to the outputs of large language models (LLMs). Guardrails:
 
-does pydantic-style validation of LLM outputs,
-takes corrective actions (e.g. reasking LLM) when validation fails,
-enforces structure and type guarantees (e.g. JSON).
+- does pydantic-style validation of LLM outputs (including semantic validation such as checking for bias in generated text, checking for bugs in generated code, etc.)
+- takes corrective actions (e.g. reasking LLM) when validation fails,
+- enforces structure and type guarantees (e.g. JSON).
 
 
 ## 🚒 Under the hood
```
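The three guarantees listed in the README can be pictured as a validate-then-reask loop. The sketch below is a framework-free illustration in plain Python of that idea, not Guardrails' actual API; `call_llm`, `validate`, and the JSON shape are hypothetical stand-ins.

```python
import json
from typing import Optional

def call_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real LLM call; returns a canned answer.
    return '{"name": "Ada", "age": 36}'

def validate(raw: str) -> tuple[Optional[dict], Optional[str]]:
    """Structure and type checks in the spirit of pydantic-style validation."""
    try:
        data = json.loads(raw)  # structure guarantee: output must be JSON
    except json.JSONDecodeError:
        return None, "output was not valid JSON"
    if not isinstance(data.get("age"), int):  # type guarantee on one field
        return None, "age must be an integer"
    return data, None

def guarded_call(prompt: str, max_reasks: int = 2) -> Optional[dict]:
    """Corrective action: on failure, reask with the error message appended."""
    for _ in range(max_reasks + 1):
        output, error = validate(call_llm(prompt))
        if error is None:
            return output
        prompt += f"\nYour previous answer failed validation ({error}). Try again."
    return None

print(guarded_call("Describe a person as JSON."))
```

The real library goes well beyond this, e.g. the semantic validators the commit adds to the description, but the loop shape is the core idea.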
2 changes: 1 addition & 1 deletion docs/guard.md
```diff
@@ -1,7 +1,7 @@
 <!-- ::: my_library.my_module.my_class -->
 
 
-::: guardrails.guardrails.Guard
+::: guardrails.guard.Guard
     options:
       members:
         - "from_rail"
```
12 changes: 6 additions & 6 deletions docs/index.md
```diff
@@ -6,9 +6,9 @@ _Note: Guardrails is an alpha release, so expect sharp edges and bugs._
 
 Guardrails is a Python package that lets a user add structure, type and quality guarantees to the outputs of large language models (LLMs). Guardrails:
 
-does pydantic-style validation of LLM outputs,
-takes corrective actions (e.g. reasking LLM) when validation fails,
-enforces structure and type guarantees (e.g. JSON).
+- does pydantic-style validation of LLM outputs. This includes semantic validation such as checking for bias in generated text, checking for bugs in generated code, etc.
+- takes corrective actions (e.g. reasking LLM) when validation fails,
+- enforces structure and type guarantees (e.g. JSON).
 
 ## 🚒 Under the hood
 
```
```diff
@@ -40,12 +40,12 @@ To learn more about the `rail` spec and the design decisions behind it, check ou
 ## 📍 Roadmap
 
 - [ ] Adding more examples, new use cases and domains
-- [ ] Adding integrations with langchain, gpt-index, minichain, manifest
+- [x] Adding integrations with langchain, gpt-index, minichain, manifest
 - [ ] Expanding validators offering
 - [ ] More compilers from `.rail` -> LLM prompt (e.g. `.rail` -> TypeScript)
 - [ ] Informative logging
-- [ ] Improving reasking logic
+- [x] Improving reasking logic
 - [ ] A guardrails.js implementation
 - [ ] VSCode extension for `.rail` files
 - [ ] Next version of `.rail` format
-- [ ] Add more LLM providers
+- [x] Add more LLM providers
```
7 changes: 4 additions & 3 deletions docs/integrations/pydantic_validation.ipynb
```diff
@@ -8,7 +8,7 @@
 "# Validating LLM Outputs with Pydantic\n",
 "\n",
 "!!! note\n",
-"    To download this example as a Jupyter notebook, click [here](https://github.com/ShreyaR/guardrails/blob/main/docs/examples/pydantic_validation.ipynb).\n",
+"    To download this example as a Jupyter notebook, click [here](https://github.com/ShreyaR/guardrails/blob/main/docs/integrations/pydantic_validation.ipynb).\n",
 "\n",
 "In this example, we will use Guardrails with Pydantic.\n",
 "\n",
```
```diff
@@ -38,8 +38,9 @@
 "Ordinarily, we would create an RAIL spec in a separate file. For the purposes of this example, we will create the spec in this notebook as a string following the RAIL syntax. For more information on RAIL, see the [RAIL documentation](../rail/output.md).\n",
 "\n",
 "Here, we define a Pydantic model for a `Person` with the following fields:\n",
-"- `name`: a string\n",
-"- `age`: an integer\n",
+"\n",
+"- `name`: a string \n",
+"- `age`: an integer \n",
 "- `zip_code`: a string zip code\n",
 "\n",
 "and write very simple validators for the fields as an example. As a way to show how LLM reasking can be used to generate data that is consistent with the Pydantic model, we can define a validator that asks for a zip code in California (including being perversely opposed to the \"90210\" zip code). If this validator fails, the LLM will be sent the error message and will reask the question.\n",
```
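The `Person` model and zip-code validator that the notebook hunk describes can be sketched without any framework. The following is a hypothetical, framework-free illustration of the stated rules (a California zip code, with "90210" perversely rejected), not the notebook's actual Pydantic code; the "starts with 9" check is a simplification introduced here.

```python
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    age: int
    zip_code: str

def validate_zip(zip_code: str) -> list:
    """Return a list of error messages; an empty list means the value passed."""
    errors = []
    # Simplified stand-in for "must be in California": CA zip codes start with 9.
    if not (zip_code.isdigit() and len(zip_code) == 5 and zip_code.startswith("9")):
        errors.append("zip_code must be a 5-digit California zip code")
    # The notebook's validator is "perversely opposed" to 90210.
    if zip_code == "90210":
        errors.append("zip_code must not be 90210")
    return errors

print(validate_zip("94105"))   # a passing California zip code
print(validate_zip("90210"))   # rejected by the example's rule
```

In a reask setup like the one the notebook demonstrates, the returned error messages would be sent back to the LLM as the correction prompt.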
1 change: 1 addition & 0 deletions mkdocs.yml
```diff
@@ -43,6 +43,7 @@ nav:
   # - 'SFW tutoring system for kids': examples/sfw_tutoring.md
   - 'Integrations':
     - 'LangChain': integrations/langchain.ipynb
+    - 'Pydantic': integrations/pydantic_validation.ipynb
 
 
 theme:
```
