## Working with tools and agents

Sharing tools and agents on the HF Hub in a standardized way is not implemented yet. This page contains some initial thoughts on this.
### How to handle tools?

Potential standard ways of storing tools:
- JSON files: tool use and function calling are often handled via JSON strings, and different libraries then provide different abstractions on top of this.
- `.py` files: libraries like `LangChain` or `Transformers.Agents` enable the use of tools/functions via normal Python functions with docstrings and a decorator. This would be less universally compatible/interoperable, though.
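As an illustration of the JSON option, the function-calling schema popularized by LLM APIs describes a tool as a JSON object with a name, description, and JSON Schema parameters. A minimal sketch of that idea (the `get_weather` tool is a made-up example, not a format the HF Hub currently prescribes):

```python
import json

# A hypothetical tool described in the JSON function-calling style
# (name + description + JSON Schema parameters). Illustrative only.
get_weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name, e.g. 'Paris'"},
        },
        "required": ["city"],
    },
}

# Such a definition could be stored as a plain .json file in a Hub repo
# and loaded by any client library, regardless of programming language.
serialized = json.dumps(get_weather_tool, indent=2)
loaded = json.loads(serialized)
print(loaded["name"])
```

The appeal of this route is exactly the round-trip shown above: the tool definition survives serialization unchanged and carries no Python-specific semantics.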
`Transformers.Agents` currently has [Tool.push_to_hub](https://huggingface.co/docs/transformers/v4.45.2/en/main_classes/agent#transformers.Tool.push_to_hub), which pushes tools to the Hub as a Space. This makes sense if users want a hosted tool with compute, but it is not interoperable with API client libraries. Some tools & prompts are stored [here](https://huggingface.co/huggingface-tools).
### How to handle agents?

TBD
<!--
3. Example: Agent Model Repo
- (maybe:) OAI MLEBench Agents/Dataset: https://github.com/openai/mle-bench (Seems like no nice tabular dataset provided.)
- Or Aymeric's GAIA prompts
-->
# HF Hub Prompts

Prompts have become a key artifact for researchers and practitioners working with AI.
There is, however, no standardized way of sharing prompts.
Prompts are shared on the HF Hub in [.txt files](https://huggingface.co/HuggingFaceFW/fineweb-edu-classifier/blob/main/utils/prompt.txt),
in [HF datasets](https://huggingface.co/datasets/fka/awesome-chatgpt-prompts),
as strings in [model cards](https://huggingface.co/OpenGVLab/InternVL2-8B#grounding-benchmarks),
or on GitHub as [Python strings](https://github.com/huggingface/cosmopedia/tree/main/prompts),
in [JSON, YAML](https://github.com/hwchase17/langchain-hub/blob/master/prompts/README.md),
or in [Jinja2](https://github.com/argilla-io/distilabel/tree/main/src/distilabel/steps/tasks/templates).

A Python library for sharing and downloading prompts on the Hugging Face Hub.

## Overview

HF Hub Prompts is a Python library that makes it easy to share and download prompts using the Hugging Face Hub infrastructure. It supports both standard text prompts and chat-based prompts, with features for template variables and formatting for different LLM clients.
## Objectives and non-objectives of this library

### Objectives

1. Provide a Python library that simplifies and standardizes the sharing of prompts on the Hugging Face Hub.
2. Start an open discussion on the best way of standardizing and encouraging the sharing of prompts on the HF Hub, building upon the HF Hub's existing repository types and ensuring interoperability with other prompt-related libraries.

### Non-objectives

- Compete with full-featured prompting libraries like [LangChain](https://github.com/langchain-ai/langchain), [ell](https://docs.ell.so/reference/index.html), etc. The objective is, instead, a simple solution for sharing prompts on the HF Hub, which is compatible with other libraries and which the community can build upon.
## Features

- Download prompts from the Hugging Face Hub
- Support for both text and chat-based prompts
- Template variable system
- Format prompts for different LLM clients (OpenAI, Anthropic)
- LangChain compatibility
- YAML-based prompt storage

## Quick start

Install the package:

```bash
pip install hf-hub-prompts
```
### Basic usage

```python
from hf_hub_prompts import download_prompt

# Download a prompt template
prompt = download_prompt(
    repo_id="username/repo_name",
    filename="my-prompt.yaml"
)

# Use the template
populated = prompt.populate_template(
    variable1="value1",
    variable2="value2"
)

# Get the final text
text = populated.content
```
A more complete walkthrough, from listing available prompts to formatting for a specific client:

```python
>>> # 1. List available prompts in a Hub repository:
>>> from hf_hub_prompts import list_prompt_templates
>>> files = list_prompt_templates("MoritzLaurer/example_prompts")
>>> files
['code_teacher.yaml', 'translate.yaml']

>>> # 2. Download a prompt template:
>>> from hf_hub_prompts import download_prompt_template
>>> prompt_template = download_prompt_template(
...     repo_id="MoritzLaurer/example_prompts",
...     filename="code_teacher.yaml"
... )

>>> # 3. Inspect the template:
>>> prompt_template.messages
[{'role': 'system', 'content': 'You are a coding assistant who explains concepts clearly and provides short examples.'}, {'role': 'user', 'content': 'Explain what {concept} is in {programming_language}.'}]
>>> # Check required input variables
>>> prompt_template.input_variables
['concept', 'programming_language']

>>> # 4. Populate the template with variables
>>> messages = prompt_template.create_messages(
...     concept="list comprehension",
...     programming_language="Python"
... )
>>> # By default, the populated prompt is in the OpenAI messages format,
>>> # which is compatible with many open-source LLM clients
>>> messages
[{'role': 'system', 'content': 'You are a coding assistant who explains concepts clearly and provides short examples.'}, {'role': 'user', 'content': 'Explain what list comprehension is in Python.'}]

>>> # You can also format for other clients, e.g. Anthropic
>>> messages_anthropic = prompt_template.create_messages(
...     client="anthropic",
...     concept="list comprehension",
...     programming_language="Python"
... )
>>> messages_anthropic
{'system': 'You are a coding assistant who explains concepts clearly and provides short examples.', 'messages': [{'role': 'user', 'content': 'Explain what list comprehension is in Python.'}]}
```
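Based on the `messages` and `input_variables` values inspected above, a stored prompt file such as `code_teacher.yaml` plausibly looks something like the following. The exact schema is an assumption inferred from the inspected template, not taken from the library's documentation:

```yaml
# Hypothetical contents of code_teacher.yaml (schema inferred, not official)
messages:
  - role: system
    content: You are a coding assistant who explains concepts clearly and provides short examples.
  - role: user
    content: Explain what {concept} is in {programming_language}.
```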