πŸ“š Add LlamaCPP and grammars info
shroominic committed Jan 30, 2024
1 parent 8dfa095 commit b116b83
16 changes: 16 additions & 0 deletions docs/concepts/local-models.md
@@ -1,5 +1,21 @@
# Local Models

Funcchain supports local models through the [llama.cpp](https://github.com/ggerganov/llama.cpp) project using the [llama_cpp_python](https://llama-cpp-python.readthedocs.io/en/latest/) bindings.

## LlamaCPP

LlamaCPP is a library, written in highly optimized C++, for running large language models locally.
It uses GGUF files, a binary format for storing quantized versions of large language models.
You can download a lot of GGUF models from TheBloke on Hugging Face.
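Every GGUF file starts with a small fixed header: the ASCII magic `GGUF`, a version number, and the tensor and metadata counts. As a minimal sketch (field layout per the GGUF spec; the header bytes here are synthetic for demonstration), you can parse it with only the standard library:

```python
import struct

def read_gguf_header(data: bytes) -> dict:
    """Parse the fixed-size GGUF header: magic, version, tensor and metadata counts."""
    # Little-endian: 4-byte magic, uint32 version, uint64 tensor count, uint64 metadata count.
    magic, version, tensor_count, kv_count = struct.unpack_from("<4sIQQ", data, 0)
    if magic != b"GGUF":
        raise ValueError("not a GGUF file")
    return {"version": version, "tensors": tensor_count, "metadata_kvs": kv_count}

# Synthetic header for demonstration: GGUF v3, 2 tensors, 5 metadata key-value pairs.
header = struct.pack("<4sIQQ", b"GGUF", 3, 2, 5)
print(read_gguf_header(header))
```

In practice you would pass the first bytes of a real `.gguf` file instead of the synthetic header.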

## Grammars

Context-free grammars are a powerful abstraction for describing the exact shape a string is allowed to take.
Funcchain uses this to force local models to respond in a structured way.

For example, you can create a grammar that forces the model to always respond with a JSON object.
This is useful when you want to consume the model's output in your code.
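As an illustration, a GBNF grammar (the grammar format llama.cpp uses) along these lines constrains output to a flat JSON object with string values. This is a simplified sketch, not the exact grammar funcchain generates:

```gbnf
root   ::= object
object ::= "{" ws ( pair ( "," ws pair )* )? ws "}"
pair   ::= string ":" ws string
string ::= "\"" [a-zA-Z0-9 _-]* "\""
ws     ::= [ \t\n]*
```

During sampling, tokens that would violate these rules are excluded, so the model can only produce strings matching `root`.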

Going one step further, you can create a grammar that forces the model to respond with JSON matching a specific Pydantic model.
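The building block for this is the model's JSON schema, which Pydantic emits directly; a grammar generator can then walk the schema's fields and types to produce matching GBNF rules. A minimal sketch of the schema side, assuming Pydantic v2 is installed (the `Task` model is a hypothetical example):

```python
from pydantic import BaseModel

class Task(BaseModel):
    name: str
    difficulty: int
    keywords: list[str]

# The JSON schema lists every field with its type; a grammar generator
# can translate this structure into GBNF rules the model must follow.
schema = Task.model_json_schema()
print(sorted(schema["properties"]))
```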

This is how funcchain is able to use local models in a structured way.
