From 1eec80ae5cb3fd656a49c04db77091e2ebcd04a7 Mon Sep 17 00:00:00 2001
From: Václav Pavlín
Date: Tue, 10 Dec 2024 09:46:30 +0100
Subject: [PATCH] chore: add a link to available models doc

---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index 40b48eb..b6d5bb8 100644
--- a/README.md
+++ b/README.md
@@ -2,6 +2,8 @@
 Hello, drians! 👋 Here's a guide to help you understand the minimum specs needed for running different LLMs. We've broken it down into two main categories: GPU-enabled nodes and CPU-only nodes. Since you can run your nodes on machines both with or without GPU.
 
+You can find the actual model names in [Available Models](https://docs.dria.co/how-to/models/) in our docs.
+
 -
 ## 🖥️ GPU-Enabled Nodes
 
 These specs are based on a system with 16 CPUs and 64GB RAM.