readme : add LLMUnity to UI projects (#9381)
* add LLMUnity to UI projects

* add newline to examples/rpc/README.md to fix editorconfig-checker unit test
amakropoulos authored Sep 9, 2024 · 1 parent 54f376d · commit 5ed0875
Showing 2 changed files with 3 additions and 1 deletion.
1 change: 1 addition & 0 deletions README.md
@@ -163,6 +163,7 @@ Unless otherwise noted these projects are open-source with permissive licensing:
 - [AI Sublime Text plugin](https://github.com/yaroslavyaroslav/OpenAI-sublime-text) (MIT)
 - [AIKit](https://github.com/sozercan/aikit) (MIT)
 - [LARS - The LLM & Advanced Referencing Solution](https://github.com/abgulati/LARS) (AGPL)
+- [LLMUnity](https://github.com/undreamai/LLMUnity) (MIT)
 
 *(to have a project listed here, it should clearly state that it depends on `llama.cpp`)*
3 changes: 2 additions & 1 deletion examples/rpc/README.md
@@ -70,4 +70,5 @@ Finally, when running `llama-cli`, use the `--rpc` option to specify the host an
 $ bin/llama-cli -m ../models/tinyllama-1b/ggml-model-f16.gguf -p "Hello, my name is" --repeat-penalty 1.0 -n 64 --rpc 192.168.88.10:50052,192.168.88.11:50052 -ngl 99
 ```
 
-This way you can offload model layers to both local and remote devices.
+This way you can offload model layers to both local and remote devices.
