Fix handling of file_not_found errors #499

Merged 1 commit on Dec 4, 2024
2 changes: 1 addition & 1 deletion container-images/scripts/build_llama_and_whisper.sh
@@ -21,7 +21,7 @@ dnf_install() {
     curl --retry 8 --retry-all-errors -o \
         /etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-Official "$url"
     rpm --import /etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-Official
-    dnf install -y mesa-vulkan-drivers-23.3.3-102.el9 "${vulkan_rpms[@]}"
+    dnf install -y mesa-vulkan-drivers "${vulkan_rpms[@]}"
 elif [ "$containerfile" = "asahi" ]; then
     dnf copr enable -y @asahi/fedora-remix-branding
     dnf install -y asahi-repos
2 changes: 1 addition & 1 deletion docs/ramalama.1.md
@@ -19,7 +19,7 @@ Running in containers eliminates the need for users to configure the host system

 RamaLama pulls AI Models from model registries. Starting a chatbot or a rest API service from a simple single command. Models are treated similarly to how Podman and Docker treat container images.

-When both Podman and Docker are installed, RamaLama defaults to Podman, The `RAMALAMA_CONTAINER_ENGINE=docker` environment variable can override this behavior. When neither are installed RamaLama attempts to run the model with software on the local system.
+When both Podman and Docker are installed, RamaLama defaults to Podman. The `RAMALAMA_CONTAINER_ENGINE=docker` environment variable can override this behaviour. When neither is installed, RamaLama attempts to run the model with software on the local system.

 Note:
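The engine-selection rule the man page describes (Podman preferred, `RAMALAMA_CONTAINER_ENGINE` overrides, fall back to host execution when neither engine exists) can be sketched roughly as follows. This is an illustration of the documented behaviour, not RamaLama's actual implementation; the function name `container_engine` is hypothetical.

```python
import os
import shutil


def container_engine():
    """Pick a container engine as described in ramalama.1.md.

    RAMALAMA_CONTAINER_ENGINE wins when set; otherwise Podman is
    preferred over Docker; None means "run with software on the
    local system".
    """
    override = os.getenv("RAMALAMA_CONTAINER_ENGINE")
    if override:
        return override
    for engine in ("podman", "docker"):  # Podman checked first
        if shutil.which(engine):
            return engine
    return None
```

A caller would then build its `podman run` / `docker run` command line from the returned name, or skip containers entirely when it gets `None`.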
30 changes: 17 additions & 13 deletions ramalama/model.py
@@ -21,17 +21,17 @@


 file_not_found = """\
-RamaLama requires the "%s" command to be installed on the host when running with --nocontainer.
-RamaLama is designed to run AI Models inside of containers, where "%s" is already installed.
-Either install a package containing the "%s" command or run the workload inside of a container.
-"""
+RamaLama requires the "%(cmd)s" command to be installed on the host when running with --nocontainer.
+RamaLama is designed to run AI Models inside of containers, where "%(cmd)s" is already installed.
+Either install a package containing the "%(cmd)s" command or run the workload inside of a container.
+%(error)s"""

 file_not_found_in_container = """\
-RamaLama requires the "%s" command to be installed inside of the container.
+RamaLama requires the "%(cmd)s" command to be installed inside of the container.
 RamaLama requires the server application be installed in the container images.
-Either install a package containing the "%s" command in the container or run
-with the default RamaLama image.
-"""
+Either install a package containing the "%(cmd)s" command in the container or run
+with the default RamaLama image.
+%(error)s"""


 class Model:
@@ -227,7 +227,7 @@ def exec_model_in_container(self, model_path, cmd_args, args):
     dry_run(conman_args)
     return True

-exec_cmd(conman_args, debug=args.debug)
+run_cmd(conman_args, debug=args.debug)
 return True

 def run(self, args):
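The `exec_cmd` to `run_cmd` swap above matters because the two kinds of helper behave differently: one replaces the current process, the other spawns a child and returns, so code after the call is actually reached. A minimal sketch of that distinction, assuming `exec_cmd` wraps `os.execvp` and `run_cmd` wraps `subprocess.run` (the names match ramalama's helpers, but these bodies are illustrative):

```python
import os
import subprocess


def run_cmd(args, debug=False):
    # Spawns a child process and *returns* to the caller, so a
    # following statement such as `return True` still executes.
    if debug:
        print("+", *args)
    return subprocess.run(args, check=True)


def exec_cmd(args, debug=False):
    # Replaces the current process image; nothing after this call
    # in the caller ever runs.
    if debug:
        print("+", *args)
    os.execvp(args[0], args)
```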
@@ -295,8 +295,10 @@ def run(self, args):
     exec_cmd(exec_args, args.debug, debug=args.debug)
 except FileNotFoundError as e:
     if in_container():
-        raise NotImplementedError(file_not_found_in_container % (exec_args[0], str(e).strip("'")))
-    raise NotImplementedError(file_not_found % (exec_args[0], exec_args[0], exec_args[0], str(e).strip("'")))
+        raise NotImplementedError(
+            file_not_found_in_container % {"cmd": exec_args[0], "error": str(e).strip("'")}
+        )
+    raise NotImplementedError(file_not_found % {"cmd": exec_args[0], "error": str(e).strip("'")})

 def serve(self, args):
     if hasattr(args, "name") and args.name:
Expand Down Expand Up @@ -355,8 +357,10 @@ def serve(self, args):
exec_cmd(exec_args, debug=args.debug)
except FileNotFoundError as e:
if in_container():
raise NotImplementedError(file_not_found_in_container % (exec_args[0], str(e).strip("'")))
raise NotImplementedError(file_not_found % (exec_args[0], exec_args[0], exec_args[0], str(e).strip("'")))
raise NotImplementedError(
file_not_found_in_container % {"cmd": exec_args[0], "error": str(e).strip("'")}
)
raise NotImplementedError(file_not_found % {"cmd": exec_args[0], "error": str(e).strip("'")})

def quadlet(self, model, args, exec_args):
quadlet = Quadlet(model, args, exec_args)
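The reason the `run` and `serve` call sites shrink is the switch from positional `%s` placeholders, which forced `exec_args[0]` to be passed once per occurrence, to named `%(cmd)s`/`%(error)s` placeholders filled from a single dict. A minimal illustration with simplified templates (the command name and messages here are hypothetical, not the PR's exact strings):

```python
cmd = "llama-run"  # hypothetical command name, for illustration
error = "No such file or directory"

# Old style: positional "%s" needs one argument per placeholder.
old_template = 'RamaLama requires the "%s" command; install "%s" to continue.'
old_message = old_template % (cmd, cmd)  # cmd repeated for every "%s"

# New style: one dict entry fills every "%(cmd)s" occurrence, and extra
# keys like "error" slot in without changing the argument list.
new_template = (
    'RamaLama requires the "%(cmd)s" command; install "%(cmd)s" to continue.\n'
    "%(error)s"
)
new_message = new_template % {"cmd": cmd, "error": error}
```

Named placeholders also make typos easier to catch: a mapping-key form like the stray `$(error)s` in the original patch would simply print literally instead of raising, which is why reviewing these templates carefully pays off.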