From bd215ce8c954558f0c4f087285c2e63ee83a6443 Mon Sep 17 00:00:00 2001
From: Alejandro
Date: Tue, 13 Aug 2024 16:09:42 -0700
Subject: [PATCH 1/5] Added installing validators concepts page with incode examples

---
 docs/concepts/installing_validators.md | 72 ++++++++++++++++++++++++++
 docusaurus/sidebars.js                 |  1 +
 2 files changed, 73 insertions(+)
 create mode 100644 docs/concepts/installing_validators.md

diff --git a/docs/concepts/installing_validators.md b/docs/concepts/installing_validators.md
new file mode 100644
index 000000000..7d7f311d5
--- /dev/null
+++ b/docs/concepts/installing_validators.md
@@ -0,0 +1,72 @@
+# Installing Validators
+
+## Guardrails Hub
+
+Validators can be combined into Input and Output Guards that intercept the inputs and outputs of LLMs. A large collection of Validators can be found at the [Guardrails Hub](https://hub.guardrailsai.com/).
+
+*Guardrails Hub gif*
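+
+As a quick sketch of how validators combine, a single guard can stack several hub validators (a minimal example, assuming chained `use` calls; both validators would already need to be installed from the hub, and `CompetitorCheck` is shown purely for illustration):
+
+```python
+from guardrails import Guard
+from guardrails.hub import CompetitorCheck, ToxicLanguage
+
+# Validators run in the order they are added to the guard
+guard = Guard().use(
+    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
+).use(
+    CompetitorCheck, competitors=["Acme Corp"], on_fail="exception"  # illustrative validator/args
+)
+
+guard.validate("The weather is lovely today.")
+```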
+
+
+## Installing
+
+Once you have found a Validator on the hub, you can click on the Validator REAMDE to find the install link.
+
+### Using CLI
+
+You can install a validator using the Guardrails CLI. For example, the [Toxic Language](https://hub.guardrailsai.com/validator/guardrails/toxic_language) validator can be installed with:
+
+```bash
+guardrails hub install hub://guardrails/toxic_language
+```
+
+At which point you can start to use the Validator:
+
+```python
+from guardrails.hub import ToxicLanguage
+from guardrails import Guard
+
+guard = Guard().use(
+    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
+)
+
+guard.validate("My landlord is an asshole!")
+```
+
+### In Code Installs - Pattern A
+
+You can also install validators using the Guardrails SDK, which simplifies development particularly when using Jupyter Notebooks:
+
+```python
+from guardrails import Guard, install
+
+install("hub://guardrails/toxic_language")
+
+from guardrails.hub import ToxicLanguage
+
+guard = Guard().use(
+    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
+)
+
+guard.validate("My landlord is an asshole!")
+```
+> Note: Invoking `install` always installs the validator module so it's recommended for the install to be in a seperate code block when using Notebooks.
+
+### In Code Installs - Pattern B
+
+You can also extract the validator directly from the installed module as follows:
+
+```python
+from guardrails import Guard, install
+
+ToxicLanguage = install("hub://guardrails/toxic_language").ToxicLanguage
+
+guard = Guard().use(
+    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
+)
+
+guard.validate("My landlord is an asshole!")
+```
+
+> Note: Invoking `install` always installs the validator module so it's recommended for the install to be in a seperate code block when using Notebooks.
\ No newline at end of file
diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js
index 0a45a95ad..6f30da094 100644
--- a/docusaurus/sidebars.js
+++ b/docusaurus/sidebars.js
@@ -51,6 +51,7 @@ const sidebars = {
   concepts: [
     "concepts/guard",
     "concepts/validators",
+    "concepts/installing_validators",
     // "concepts/guardrails",
     "concepts/hub",
     "concepts/deploying",

From 2d51424d06d32e79333ae3613bb89f4813449037 Mon Sep 17 00:00:00 2001
From: Alejandro
Date: Tue, 13 Aug 2024 16:30:37 -0700
Subject: [PATCH 2/5] updates to validator doc

---
 docs/concepts/installing_validators.md | 26 ++++++++++++++++++++++----
 1 file changed, 22 insertions(+), 4 deletions(-)

diff --git a/docs/concepts/installing_validators.md b/docs/concepts/installing_validators.md
index 7d7f311d5..c8f1ad3ac 100644
--- a/docs/concepts/installing_validators.md
+++ b/docs/concepts/installing_validators.md
@@ -11,7 +11,7 @@ Validators can be combined into Input and Output Guards that intercept
 
 ## Installing
 
-Once you have found a Validator on the hub, you can click on the Validator REAMDE to find the install link.
+Once you have found a Validator on the hub, you can click on the Validator `README` to find the install link.
 
 ### Using CLI
 
 You can install a validator using the Guardrails CLI. For example, the [Toxic Language](https://hub.guardrailsai.com/validator/guardrails/toxic_language) validator can be installed with:
 
 ```bash
 guardrails hub install hub://guardrails/toxic_language
 ```
 
+> This will not download local models if you opted into remote inferencing during `guardrails configure`.
+
+If you want to control whether associated models are downloaded, you can use the `--install-local-models` or `--no-install-local-models` flags with `guardrails hub install`.
+
 At which point you can start to use the Validator:
 
 ```python
 from guardrails.hub import ToxicLanguage
 from guardrails import Guard
@@ -34,9 +38,23 @@ guard = Guard().use(
 
 guard.validate("My landlord is an asshole!")
 ```
 
+### In Code Installs
+
+You can also install validators using the Guardrails SDK, which simplifies development particularly when using Jupyter Notebooks.
+
+```python
+from guardrails import install
+
+install(
+    "hub://guardrails/toxic_language",
+    install_local_models=True, # defaults to `None` - which will not download local models if you opted into remote inferencing.
+    quiet=False # defaults to `True`
+)
+```
+
 ### In Code Installs - Pattern A
 
-You can also install validators using the Guardrails SDK, which simplifies development particularly when using Jupyter Notebooks:
+After an `install` invocation you can import a validator as you typically would:
 
 ```python
 from guardrails import Guard, install
@@ -51,7 +69,6 @@ guard = Guard().use(
 
 guard.validate("My landlord is an asshole!")
 ```
-> Note: Invoking `install` always installs the validator module so it's recommended for the install to be in a seperate code block when using Notebooks.
 
 ### In Code Installs - Pattern B
 
@@ -69,4 +86,5 @@ guard.validate("My landlord is an asshole!")
 ```
 
-> Note: Invoking `install` always installs the validator module so it's recommended for the install to be in a seperate code block when using Notebooks.
\ No newline at end of file
+
+> Note: Invoking the `install` SDK always installs the validator module so it's recommended for the install to be in a seperate code block when using Notebooks.
\ No newline at end of file

From 7845a401cf7d23991a43d00da77210abe5791d89 Mon Sep 17 00:00:00 2001
From: Alejandro
Date: Tue, 13 Aug 2024 16:32:03 -0700
Subject: [PATCH 3/5] added installation to validators doc

---
 docs/concepts/installing_validators.md | 90 --------------------------
 docs/concepts/validators.md            | 86 ++++++++++++++++++++++++
 docusaurus/sidebars.js                 |  1 -
 3 files changed, 86 insertions(+), 91 deletions(-)
 delete mode 100644 docs/concepts/installing_validators.md

diff --git a/docs/concepts/installing_validators.md b/docs/concepts/installing_validators.md
deleted file mode 100644
index c8f1ad3ac..000000000
--- a/docs/concepts/installing_validators.md
+++ /dev/null
@@ -1,90 +0,0 @@
-# Installing Validators
-
-## Guardrails Hub
-
-Validators can be combined into Input and Output Guards that intercept the inputs and outputs of LLMs. A large collection of Validators can be found at the [Guardrails Hub](https://hub.guardrailsai.com/).
-
-*Guardrails Hub gif*
-
-
-## Installing
-
-Once you have found a Validator on the hub, you can click on the Validator `README` to find the install link.
-
-### Using CLI
-
-You can install a validator using the Guardrails CLI. For example, the [Toxic Language](https://hub.guardrailsai.com/validator/guardrails/toxic_language) validator can be installed with:
-
-```bash
-guardrails hub install hub://guardrails/toxic_language
-```
-
-> This will not download local models if you opted into remote inferencing during `guardrails configure`.
-
-If you want to control whether associated models are downloaded, you can use the `--install-local-models` or `--no-install-local-models` flags with `guardrails hub install`.
-
-At which point you can start to use the Validator:
-
-```python
-from guardrails.hub import ToxicLanguage
-from guardrails import Guard
-
-guard = Guard().use(
-    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
-)
-
-guard.validate("My landlord is an asshole!")
-```
-
-### In Code Installs
-
-You can also install validators using the Guardrails SDK, which simplifies development particularly when using Jupyter Notebooks.
-
-```python
-from guardrails import install
-
-install(
-    "hub://guardrails/toxic_language",
-    install_local_models=True, # defaults to `None` - which will not download local models if you opted into remote inferencing.
-    quiet=False # defaults to `True`
-)
-```
-
-### In Code Installs - Pattern A
-
-After an `install` invocation you can import a validator as you typically would:
-
-```python
-from guardrails import Guard, install
-
-install("hub://guardrails/toxic_language")
-
-from guardrails.hub import ToxicLanguage
-
-guard = Guard().use(
-    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
-)
-
-guard.validate("My landlord is an asshole!")
-```
-
-### In Code Installs - Pattern B
-
-You can also extract the validator directly from the installed module as follows:
-
-```python
-from guardrails import Guard, install
-
-ToxicLanguage = install("hub://guardrails/toxic_language").ToxicLanguage
-
-guard = Guard().use(
-    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
-)
-
-guard.validate("My landlord is an asshole!")
-```
-
-
-> Note: Invoking the `install` SDK always installs the validator module so it's recommended for the install to be in a seperate code block when using Notebooks.
\ No newline at end of file
diff --git a/docs/concepts/validators.md b/docs/concepts/validators.md
index ad83df7f2..7b6882892 100644
--- a/docs/concepts/validators.md
+++ b/docs/concepts/validators.md
@@ -107,4 +107,90 @@ git clone git@github.com:guardrails-ai/validator-template.git
 
 Once the repository is cloned and the validator is created, you can register the validator via this [Google Form](https://forms.gle/N6UaE9611niuMxZj7).
 
+## Installing Validators
+
+### Guardrails Hub
+
+Validators can be combined into Input and Output Guards that intercept the inputs and outputs of LLMs. A large collection of Validators can be found at the [Guardrails Hub](https://hub.guardrailsai.com/).
+
+*Guardrails Hub gif*
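+
+As a quick sketch of how validators combine, a single guard can stack several hub validators (a minimal example, assuming chained `use` calls; both validators would already need to be installed from the hub, and `CompetitorCheck` is shown purely for illustration):
+
+```python
+from guardrails import Guard
+from guardrails.hub import CompetitorCheck, ToxicLanguage
+
+# Validators run in the order they are added to the guard
+guard = Guard().use(
+    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
+).use(
+    CompetitorCheck, competitors=["Acme Corp"], on_fail="exception"  # illustrative validator/args
+)
+
+guard.validate("The weather is lovely today.")
+```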
+
+Once you have found a Validator on the hub, you can click on the Validator `README` to find the install link.
+
+### Using CLI
+
+You can install a validator using the Guardrails CLI. For example, the [Toxic Language](https://hub.guardrailsai.com/validator/guardrails/toxic_language) validator can be installed with:
+
+```bash
+guardrails hub install hub://guardrails/toxic_language
+```
+
+> This will not download local models if you opted into remote inferencing during `guardrails configure`.
+
+If you want to control whether associated models are downloaded, you can use the `--install-local-models` or `--no-install-local-models` flags with `guardrails hub install`.
+
+At which point you can start to use the Validator:
+
+```python
+from guardrails.hub import ToxicLanguage
+from guardrails import Guard
+
+guard = Guard().use(
+    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
+)
+
+guard.validate("My landlord is an asshole!")
+```
+
+### In Code Installs
+
+You can also install validators using the Guardrails SDK, which simplifies development particularly when using Jupyter Notebooks.
+
+```python
+from guardrails import install
+
+install(
+    "hub://guardrails/toxic_language",
+    install_local_models=True, # defaults to `None` - which will not download local models if you opted into remote inferencing.
+    quiet=False # defaults to `True`
+)
+```
+
+### In Code Installs - Pattern A
+
+After an `install` invocation you can import a validator as you typically would:
+
+```python
+from guardrails import Guard, install
+
+install("hub://guardrails/toxic_language")
+
+from guardrails.hub import ToxicLanguage
+
+guard = Guard().use(
+    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
+)
+
+guard.validate("My landlord is an asshole!")
+```
+
+### In Code Installs - Pattern B
+
+You can also extract the validator directly from the installed module as follows:
+
+```python
+from guardrails import Guard, install
+
+ToxicLanguage = install("hub://guardrails/toxic_language").ToxicLanguage
+
+guard = Guard().use(
+    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
+)
+
+guard.validate("My landlord is an asshole!")
+```
+
+
+> Note: Invoking the `install` SDK always installs the validator module so it's recommended for the install to be in a seperate code block when using Notebooks.
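+
+For completeness, a quick sketch of the model-download flags described in the CLI section above (only the two flags documented there; the validator slug is just an example):
+
+```bash
+# Install the validator but skip downloading its local models
+# (useful if you rely on remote inference)
+guardrails hub install hub://guardrails/toxic_language --no-install-local-models
+
+# Install the validator and download its local models
+guardrails hub install hub://guardrails/toxic_language --install-local-models
+```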
diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js
index 6f30da094..0a45a95ad 100644
--- a/docusaurus/sidebars.js
+++ b/docusaurus/sidebars.js
@@ -51,7 +51,6 @@ const sidebars = {
   concepts: [
     "concepts/guard",
     "concepts/validators",
-    "concepts/installing_validators",
     // "concepts/guardrails",
     "concepts/hub",
     "concepts/deploying",

From 28730c50ab0fa138180a5d5380cf41ff8189acce Mon Sep 17 00:00:00 2001
From: Alejandro
Date: Tue, 13 Aug 2024 16:33:35 -0700
Subject: [PATCH 4/5] updates to validator doc

---
 docs/concepts/validators.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/concepts/validators.md b/docs/concepts/validators.md
index 7b6882892..fc03e3de5 100644
--- a/docs/concepts/validators.md
+++ b/docs/concepts/validators.md
@@ -129,9 +129,9 @@ guardrails hub install hub://guardrails/toxic_language
 
 > This will not download local models if you opted into remote inferencing during `guardrails configure`.
 
-If you want to control whether associated models are downloaded, you can use the `--install-local-models` or `--no-install-local-models` flags with `guardrails hub install`.
+> If you want to control whether associated models are downloaded, you can use the `--install-local-models` or `--no-install-local-models` flags with `guardrails hub install`.
 
-At which point you can start to use the Validator:
+After installing the validator with the CLI, you can start to use the validator in your guards:
 
 ```python
 from guardrails.hub import ToxicLanguage
 from guardrails import Guard

From 93205acb2fa869d188f39e381497021915d28b45 Mon Sep 17 00:00:00 2001
From: Alejandro Esquivel
Date: Tue, 13 Aug 2024 16:50:50 -0700
Subject: [PATCH 5/5] Update docs/concepts/validators.md

Co-authored-by: dtam
---
 docs/concepts/validators.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/concepts/validators.md b/docs/concepts/validators.md
index fc03e3de5..9d0cb9ff6 100644
--- a/docs/concepts/validators.md
+++ b/docs/concepts/validators.md
@@ -193,4 +193,4 @@ guard.validate("My landlord is an asshole!")
 ```
 
-> Note: Invoking the `install` SDK always installs the validator module so it's recommended for the install to be in a seperate code block when using Notebooks.
+> Note: Invoking the `install` SDK always installs the validator module so it's recommended for the install to be in a separate code block when using Notebooks.