This is a fork of https://github.com/QuangBK/localLLM_guidance/ with added models and example agents.
StandardPrompt is a blank guidance prompt so you can design your own agent. Once you are satisfied, you can create an agent file in the 'server' folder. You can start by copying the UniversalMarkdown agent and adding your prompt.
Generally speaking, "query" is the input variable (the input box) and "resolver" is the output variable (the output box), provided your agent defines a guidance resolver variable.
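For orientation, here is a minimal sketch of the kind of guidance program such an agent wraps, with "query" as the input and "resolver" as the generated output. This is not the exact agent file format used in the 'server' folder; it assumes the pre-1.0 handlebars-style guidance API, and the model is a placeholder (app.py loads your local GPTQ model instead).

```python
import guidance

# Placeholder backend: app.py loads a local GPTQ model instead of gpt2.
guidance.llm = guidance.llms.Transformers("gpt2")

# "query" is filled from the input box; "resolver" is generated by the model
# and shown in the output box.
program = guidance("""### Instruction:
{{query}}

### Response:
{{gen 'resolver' temperature=0 max_tokens=300}}""")

out = program(query="Explain what a guidance prompt template is in one sentence.")
print(out["resolver"])
```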
How to run
- Download the models. For now, they are hard-coded in ./app.py and you will need to change them to your local paths.
- Set your models' home directory in app.py:
MODEL_DIRECTORY = "/home/shazam"
- Run the server
Optionally: install GPTQ-for-LLaMa following the Oobabooga instructions at https://github.com/oobabooga/text-generation-webui/blob/main/docs/GPTQ-models-(4-bit-mode).md. At this time, they use a forked version of GPTQ-for-LLaMa, so please pay special attention to those instructions.
python3 app.py
"StandardPrompt", "COTpromptBuilder", "COTpromptBuilder2PromptResponse", "AIDecisionMakerSimulator", "SearchToolAgentPOC", "AgentGuidanceSmartGPT", "ChatGPTAgentGuidance", "AgentGuidanceFlowGPT", "UniversalAnythingToJSON", "UniversalAnythingToMarkdown"]
- StandardPrompt is a blank guidance prompt so you can design your own agent.
- COTpromptBuilder is based on "Connect multiple ChatGPT sessions w/ dynamic ChatGPT prompts" (https://www.youtube.com/watch?v=8PbpFxPibJM).
- COTpromptBuilder2PromptResponse is the same as the above, but the resolver holds the result.
- AIDecisionMakerSimulator is an experimental simple agent that uses a decision tree to make a decision. Based on Henky!! from KoboldAI and crew.
- SearchToolAgentPOC is an experimental agent that uses a search tool to find the answer. NOTE: GoogleSerper is disabled; instead I am using SearX, which must be installed (I use the Docker version): https://python.langchain.com/en/latest/reference/modules/searx_search.html?highlight=searx (see the sketch after this list).
- AgentGuidanceSmartGPT is based on another YouTube video by code4AI (https://www.youtube.com/@code4AI).
- ChatGPTAgentGuidance is just an example of using ChatML with guidance.
- AgentGuidanceFlowGPT is an attempt to use FlowGPT Proteus.
- UniversalAnythingToJSON converts anything (!) to JSON.
- UniversalAnythingToMarkdown converts anything (including JSON) to Markdown.
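As referenced in the SearchToolAgentPOC note above, the SearX tool can be wired up through langchain roughly as follows. This is a sketch, assuming a local SearX/SearxNG instance (e.g. the Docker image) listening on port 8888; the host URL and example query are placeholders.

```python
from langchain.utilities import SearxSearchWrapper

# Assumes a SearX/SearxNG instance is reachable at this host (placeholder URL).
search = SearxSearchWrapper(searx_host="http://127.0.0.1:8888")

def searx_tool(query: str) -> str:
    """Return a short text summary of the top SearX results for the query."""
    return search.run(query)

if __name__ == "__main__":
    print(searx_tool("What is the capital of France?"))
```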
Original readme:
Guidance is a tool for controlling LLMs. It provides a good way to build prompt templates. This repository shows you how to make an agent with Guidance. You can combine it with various LLMs from Hugging Face. See my Medium article for more explanation.
UPDATE: Added gradio UI.
Python packages:
- guidance
- GPTQ-for-LLaMa
- langchain
- gradio (only for the web UI)
Note: we only use langchain to build the GoogleSerper tool. The agent itself is built only with Guidance. Feel free to change/add/modify the tools to fit your goal.
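For reference, the GoogleSerper tool that langchain provides (and that the fork above swaps out for SearX) is built roughly like this; the API key value is a placeholder.

```python
import os
from langchain.utilities import GoogleSerperAPIWrapper

# Placeholder key: replace with your own free Serper API key.
os.environ["SERPER_API_KEY"] = "your-serper-api-key"

serper = GoogleSerperAPIWrapper()

def google_serper_tool(query: str) -> str:
    """Return Serper's search answer for the query."""
    return serper.run(query)
```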
The GPTQ-for-LLaMa I used is oobabooga's fork. You can install it by following the Oobabooga instructions linked in the section above.
There are two options: run a Gradio server with a web UI, or run the notebook file.
Please modify SERPER_API_KEY, MODEL_PATH, and CHECKPOINT_PATH in app.py and run:
gradio app.py
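For orientation, the constants to edit near the top of app.py look roughly like this. All values below are placeholders, not real keys or paths.

```python
# Placeholder values only: substitute your own Serper key and local model paths.
SERPER_API_KEY = "your-serper-api-key"
MODEL_PATH = "/path/to/models/wizard-mega-13B-GPTQ"
CHECKPOINT_PATH = "/path/to/models/wizard-mega-13B-GPTQ/model.safetensors"
```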
Please check the notebook file. You should have a free Serper API key and an LLM model to run this. I use the wizard-mega-13B-GPTQ model. Feel free to try others.