Commit

Improve progress bar
Set the default width to match the terminal width. Also fixed a small bug
around the default n_gpu_layers value.

Signed-off-by: Eric Curtin <[email protected]>
ericcurtin committed Dec 14, 2024
1 parent 56eea07 commit 07a38d0
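
For context, the width change boils down to querying the terminal size at startup instead of hard-coding a bar width. Below is a minimal C++ sketch of that approach, assuming a POSIX terminal and using TIOCGWINSZ; the helper names, fallback value, and bar layout are illustrative, not the commit's actual code.

```cpp
// Minimal sketch (not the commit's actual code): size a progress bar to the
// terminal, falling back to 80 columns when the width cannot be determined.
#include <sys/ioctl.h>
#include <unistd.h>

#include <algorithm>
#include <cstdio>
#include <string>

// Hypothetical helper: return the current terminal width in columns.
static int get_terminal_width() {
    struct winsize ws{};
    if (ioctl(STDOUT_FILENO, TIOCGWINSZ, &ws) == 0 && ws.ws_col > 0) {
        return ws.ws_col;
    }
    return 80; // fallback when stdout is not a terminal
}

// Hypothetical helper: draw a bar like [=====     ]  42% within the given width.
static void print_progress(double fraction, int width) {
    const int reserved  = 8;                              // room for brackets and " 100%"
    const int bar_width = std::max(1, width - reserved);  // never let the bar go negative
    const int filled    = static_cast<int>(fraction * bar_width);

    std::string bar(filled, '=');
    bar.resize(bar_width, ' ');
    std::printf("\r[%s] %3d%%", bar.c_str(), static_cast<int>(fraction * 100));
    std::fflush(stdout);
}

int main() {
    const int width = get_terminal_width();
    for (int i = 0; i <= 100; ++i) {
        print_progress(i / 100.0, width);
    }
    std::printf("\n");
    return 0;
}
```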
Showing 2 changed files with 183 additions and 96 deletions.
README.md (4 changes: 2 additions & 2 deletions)

@@ -409,7 +409,7 @@ To learn more about model quantization, [read this documentation](examples/quant
 
 </details>
 
-[^1]: [examples/perplexity/README.md](examples/perplexity/README.md)
+[^1]: [examples/perplexity/README.md](https://github.com/ggerganov/llama.cpp/blob/master/examples/perplexity/README.md)
 [^2]: [https://huggingface.co/docs/transformers/perplexity](https://huggingface.co/docs/transformers/perplexity)
 
 ## [`llama-bench`](example/bench)

@@ -446,7 +446,7 @@ To learn more about model quantization, [read this documentation](examples/quant
 </details>
-[^3]: [https://github.com/containers/ramalama](RamaLama)
+[^3]: [RamaLama](https://github.com/containers/ramalama)
 ## [`llama-simple`](examples/simple)
