
Better docker tags #12631

Open
robertvazan opened this issue Dec 29, 2024 · 1 comment

@robertvazan

Please use Docker tags for versioning. Versioned tags let us safely upgrade and downgrade to work around version-specific issues and to identify the version that introduced a problem. Tagging with the bundled Ollama/llama.cpp/vLLM versions lets us predict what an upgrade will deliver.

For example, the image intelanalytics/ipex-llm-inference-cpp-xpu has a four-month-old 2.1.0 tag and otherwise only the rolling tags latest and 2.2.0-SNAPSHOT. I pulled latest about two months ago. If I pull now and the new version turns out to be broken, I will have no way to undo the upgrade. I can tag the image locally before upgrading as a workaround, but that will not help me when moving to a new computer. I cannot search the tag list on Docker Hub for the hash of my image to identify my version. I don't even know whether it makes sense to upgrade, because there is no information about which tag contains what software.
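For reference, the local-tag workaround mentioned above looks roughly like this (the known-good-20241229 tag name is illustrative):

```shell
# Pin the currently working image under a local name before pulling a new "latest"
docker tag intelanalytics/ipex-llm-inference-cpp-xpu:latest \
    ipex-llm-inference-cpp-xpu:known-good-20241229

# Pull the rolling tag; this moves what "latest" points to locally
docker pull intelanalytics/ipex-llm-inference-cpp-xpu:latest

# If the new version is broken, run the pinned copy instead
docker run --rm ipex-llm-inference-cpp-xpu:known-good-20241229 ...

# The local digest at least identifies the image uniquely,
# even though Docker Hub's tag list cannot be searched by digest
docker images --digests intelanalytics/ipex-llm-inference-cpp-xpu
```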

Please add (see the sketch after this list):

  • unchanging version tags at least once per month (e.g. 20241229 or 2.1.1)
  • an alias for the most recent unchanging tag: either repurpose latest (with nightly as the rolling tag) or add a new stable tag
  • tags identifying upgrades of the bundled Ollama/llama.cpp/vLLM (e.g. ollama-0.5.4)
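A publishing flow along these lines would cover all three requests; the build-1234 source tag is hypothetical:

```shell
# Hypothetical release step: stamp one build with a date tag, a component tag,
# and a stable alias, then push everything
docker tag ipex-llm-inference-cpp-xpu:build-1234 intelanalytics/ipex-llm-inference-cpp-xpu:20241229
docker tag ipex-llm-inference-cpp-xpu:build-1234 intelanalytics/ipex-llm-inference-cpp-xpu:ollama-0.5.4
docker tag ipex-llm-inference-cpp-xpu:build-1234 intelanalytics/ipex-llm-inference-cpp-xpu:stable
docker push --all-tags intelanalytics/ipex-llm-inference-cpp-xpu
```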
@liu-shaojun
Contributor

liu-shaojun commented Jan 2, 2025

Thank you for your feedback! We understand the importance of versioned tags for safe upgrades and troubleshooting.

Currently, for vLLM, the intelanalytics/ipex-llm-serving-xpu Docker image is consistently published with versioned tags, currently ranging from 2.2.0-b1 to 2.2.0-b11, so incremental improvements and fixes can be pinned. However, we acknowledge that the Ollama/llama.cpp image does not yet have stable version tags.
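With those versioned tags, upgrading and rolling back is a matter of pinning an explicit tag, for example:

```shell
# Pin an explicit vLLM image version; roll back by pinning the previous tag
docker pull intelanalytics/ipex-llm-serving-xpu:2.2.0-b11
docker pull intelanalytics/ipex-llm-serving-xpu:2.2.0-b10
```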

We plan to address this in the future by releasing stable tags for significant changes in Ollama/llama.cpp, providing better clarity and usability.

To meet your immediate needs, we can provide a 2.2.0b20250101 tag as an interim solution. Please let us know if this works for you, and feel free to share any additional suggestions!
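Assuming the interim tag lands on the intelanalytics/ipex-llm-inference-cpp-xpu image, pinning it would look like:

```shell
docker pull intelanalytics/ipex-llm-inference-cpp-xpu:2.2.0b20250101
```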
