Add man pages for ramalama commands #16

Merged: 1 commit, Jul 31, 2024
10 changes: 9 additions & 1 deletion Makefile
@@ -6,6 +6,10 @@ help:
@echo
@echo " - make build"
@echo
@echo "Build docs"
@echo
@echo " - make docs"
@echo
@echo "Install ramalama"
@echo
@echo " - make install"
@@ -22,11 +26,15 @@ help:
.PHONY:
install:
./install.sh

make -C docs install
.PHONY:
build:
./container_build.sh

.PHONY: docs
docs:
make -C docs

.PHONY:
test:
./ci.sh
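With the new targets in place, the intended build-and-install flow looks roughly like the sketch below, assuming GNU make and the scripts referenced in the diff:

```
make build          # runs ./container_build.sh
make docs           # runs `make -C docs` to generate the man pages
sudo make install   # runs ./install.sh, then `make -C docs install`
```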
25 changes: 24 additions & 1 deletion README.md
@@ -12,14 +12,37 @@ curl -fsSL https://raw.githubusercontent.com/containers/ramalama/main/install.sh

## Usage

### Listing Models

You can `list` all models pulled into local storage.

```
ramalama list
```
### Pulling Models

You can pull a model using the `pull` command. By default, it pulls from the ollama registry.
You can `pull` a model using the `pull` command. By default, it pulls from the ollama registry.

```
ramalama pull granite-code
```

### Running Models

You can `run` a chatbot on a model using the `run` command. By default, it pulls from the ollama registry.

```
ramalama run instructlab/merlinite-7b-lab
```

### Serving Models

You can `serve` a chatbot on a model using the `serve` command. By default, it pulls from the ollama registry.

```
ramalama serve llama3
```
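
Once a model is being served you can talk to it over HTTP. The endpoint details are not part of this change; the sketch below is a hypothetical request assuming the backend exposes an OpenAI-compatible `/v1/chat/completions` endpoint on port 8080, which may not match your setup.

```
# Hypothetical request; the port and path are assumptions, not confirmed by this change.
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Say hello in one sentence."}]}'
```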

## Diagram

```
33 changes: 33 additions & 0 deletions docs/Makefile
@@ -0,0 +1,33 @@
SED=sed
# The 'sort' below is crucial: without it, 'make docs' behaves differently
# on the first run than on subsequent ones, because the generated .md
MANPAGES_SOURCE_DIR = source/markdown
MANPAGES_MD ?= $(sort $(wildcard $(MANPAGES_SOURCE_DIR)/*.md))
MANPAGES ?= $(MANPAGES_MD:%.md=%)
MANPAGES_DEST ?= $(subst markdown,man, $(subst source,build,$(MANPAGES)))
GOMD2MAN ?= /usr/bin/go-md2man
SELINUXOPT ?= $(shell test -x /usr/sbin/selinuxenabled && selinuxenabled && echo -Z)

.PHONY: docdir
docdir:
mkdir -p build/man
$(MANPAGES): OUTFILE=$(subst source/markdown,build/man,$@)
$(MANPAGES): %: %.md docdir
@$(SED) -e 's/\((ramalama[^)]*\.md\(#.*\)\?)\)//g' \
-e 's/\[\(ramalama[^]]*\)\]/\1/g' \
-e 's/\[\([^]]*\)](http[^)]\+)/\1/g' \
-e 's;<\(/\)\?\(a\|a\s\+[^>]*\|sup\)>;;g' \
-e 's/\\$$/ /g' $< |\
$(GOMD2MAN) -out $(OUTFILE)

.PHONY: install
install: $(MANPAGES)
install ${SELINUXOPT} -d -m 755 $(DESTDIR)$(MANDIR)/man1
install ${SELINUXOPT} -m 644 $(filter %.1,$(MANPAGES_DEST)) $(DESTDIR)$(MANDIR)/man1

clean:
$(RM) -fr build
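
For one page, the pattern rule above amounts to stripping the markdown cross-reference syntax with sed and piping the result through go-md2man. A rough shell equivalent (showing only the first two substitutions, run from the docs/ directory, and assuming go-md2man is at /usr/bin/go-md2man):

```
# Approximate manual equivalent of the pattern rule for a single page.
mkdir -p build/man
sed -e 's/\((ramalama[^)]*\.md\(#.*\)\?)\)//g' \
    -e 's/\[\(ramalama[^]]*\)\]/\1/g' \
    source/markdown/ramalama-run.1.md |
  /usr/bin/go-md2man -out build/man/ramalama-run.1
```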

16 changes: 16 additions & 0 deletions docs/source/markdown/ramalama-list.1.md
@@ -0,0 +1,16 @@
% ramalama-list 1

## NAME
ramalama-list - List all the AI Models in local storage

## SYNOPSIS
**ramalama list** [*options*]

## DESCRIPTION
List all the AI Models in local storage.

## SEE ALSO
**[ramalama(1)](ramalama.1.md)**

## HISTORY
Aug 2024, Originally compiled by Dan Walsh <[email protected]>
16 changes: 16 additions & 0 deletions docs/source/markdown/ramalama-pull.1.md
@@ -0,0 +1,16 @@
% ramalama-pull 1

## NAME
ramalama-pull - Pull AI Models into local storage

## SYNOPSIS
**ramalama pull** [*options*] *model* [*model*...]

## DESCRIPTION
Pull the specified AI Models into local storage.

## SEE ALSO
**[ramalama(1)](ramalama.1.md)**

## HISTORY
Aug 2024, Originally compiled by Dan Walsh <[email protected]>
17 changes: 17 additions & 0 deletions docs/source/markdown/ramalama-run.1.md
@@ -0,0 +1,17 @@
% ramalama-run 1

## NAME
ramalama-run - Run specified AI Model as a chatbot

## SYNOPSIS
**ramalama run** [*options*] *model*

## DESCRIPTION
Run the specified AI Model as a chatbot. Ramalama pulls the specified AI Model from
the registry if it does not exist in local storage.

## SEE ALSO
**[ramalama(1)](ramalama.1.md)**

## HISTORY
Aug 2024, Originally compiled by Dan Walsh <[email protected]>
17 changes: 17 additions & 0 deletions docs/source/markdown/ramalama-serve.1.md
@@ -0,0 +1,17 @@
% ramalama-serve 1

## NAME
ramalama-serve - Serve specified AI Model as an API server

## SYNOPSIS
**ramalama serve** [*options*] *model*

## DESCRIPTION
Serve the specified AI Model as an API server. Ramalama pulls the specified AI Model from
the registry if it does not exist in local storage.

## SEE ALSO
**[ramalama(1)](ramalama.1.md)**

## HISTORY
Aug 2024, Originally compiled by Dan Walsh <[email protected]>
30 changes: 30 additions & 0 deletions docs/source/markdown/ramalama.1.md
@@ -0,0 +1,30 @@
% ramalama 1

## NAME
ramalama - Simple management tool for working with AI Models

## SYNOPSIS
**ramalama** [*options*] *command*

## DESCRIPTION
The goal of ramalama is to make AI even more boring.

**ramalama [GLOBAL OPTIONS]**

## COMMANDS

| Command                                   | Description                                        |
| ----------------------------------------- | -------------------------------------------------- |
| [ramalama-list(1)](ramalama-list.1.md)    | List all AI Models in local storage.               |
| [ramalama-pull(1)](ramalama-pull.1.md)    | Pull an AI Model from a registry to local storage. |
| [ramalama-run(1)](ramalama-run.1.md)      | Run a chatbot on an AI Model.                      |
| [ramalama-serve(1)](ramalama-serve.1.md)  | Serve a local AI Model as an API service.          |

## CONFIGURATION FILES


## SEE ALSO
**[podman(1)](https://github.com/containers/podman/blob/main/docs/podman.1.md)**

## HISTORY
Aug 2024, Originally compiled by Dan Walsh <[email protected]>
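
Once `make -C docs install` has copied these pages into $(DESTDIR)$(MANDIR)/man1, they should be readable with man in the usual way, assuming that directory is on your manpath:

```
# View the top-level page and one of the per-command pages.
man ramalama
man ramalama-serve
```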