Add man pages for ramalama commands #16
Conversation
README.md (Outdated)

> @@ -12,14 +12,37 @@ curl -fsSL https://raw.githubusercontent.com/containers/ramalama/main/install.sh
>
> ## Usage
>
> ### Listing Models
>
> You can `list` all models pulled into lcoal storage.
s/lcoal/local/g
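For context, a minimal sketch of the corrected usage (command only; no sample output appears in this thread, so none is invented):

```
ramalama list
```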
README.md (Outdated)

> ### Serving Models
>
> You can `serve` a chatbot on a model using the `run` command. By default, it pulls from the ollama registry.
s/run/serve/g
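With that substitution applied, the documented command matches its section heading; a sketch using granite-code, the model named as available later in this thread:

```
ramalama serve granite-code
```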
> % ramalama-list 1
>
> ## NAME
> ramalama - Simple management tool for working with AI Models
s/Simple management tool for working with AI Models/List all the AI Models in local storage/g
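Applying the suggested substitution, the NAME section would read:

```
% ramalama-list 1

## NAME
ramalama - List all the AI Models in local storage
```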
README.md (Outdated)

> You can `run` a chatbot on a model using the `run` command. By default, it pulls from the ollama registry.
>
> ```
> ramalama run merlonite
> ```
This doesn't exist for me; granite-code is an option.

I also found a bug, or something we have to enhance:

`./ramalama pull instructlab/merlinite-7b-lab`

This doesn't work, but it should; we don't support non-default namespaces like "instructlab".
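A sketch of the behavior described above, using the model names from this thread (the failure mode is as reported, not independently verified):

```
# Pulling a bare model name from the default ollama namespace works:
ramalama pull granite-code

# Pulling with an explicit namespace fails at the time of this review:
ramalama pull instructlab/merlinite-7b-lab
```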
README.md (Outdated)

> You can `serve` a chatbot on a model using the `run` command. By default, it pulls from the ollama registry.
>
> ```
> ramalama serve lama3
> ```
`./ramalama serve llama3` works; note the extra "l".
Nice.
Signed-off-by: Daniel J Walsh <[email protected]>