From 22bdb33952abe7248cca09aa5040006080adc1dc Mon Sep 17 00:00:00 2001
From: ngxson
Date: Sat, 16 Mar 2024 13:13:27 +0100
Subject: [PATCH] readme: add wllama binding

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 5cbdf7e476b09..c2f3342f0ff42 100644
--- a/README.md
+++ b/README.md
@@ -134,6 +134,7 @@ Typically finetunes of the base models below are supported as well.
 - Node.js: [withcatai/node-llama-cpp](https://github.com/withcatai/node-llama-cpp)
 - JS/TS (llama.cpp server client): [lgrammel/modelfusion](https://modelfusion.dev/integration/model-provider/llamacpp)
 - JavaScript/Wasm (works in browser): [tangledgroup/llama-cpp-wasm](https://github.com/tangledgroup/llama-cpp-wasm)
+- Typescript/Wasm (nicer API, available on npm): [ngxson/wllama](https://github.com/ngxson/wllama)
 - Ruby: [yoshoku/llama_cpp.rb](https://github.com/yoshoku/llama_cpp.rb)
 - Rust (nicer API): [mdrokz/rust-llama.cpp](https://github.com/mdrokz/rust-llama.cpp)
 - Rust (more direct bindings): [utilityai/llama-cpp-rs](https://github.com/utilityai/llama-cpp-rs)