[Bugfix] Require triton >= 3.0.0 to resolve issue with MoE and TP>1 #6304

Closed
wants to merge 1 commit

Conversation

tdoublep
Member

Fixes #6103 via triton-lang/triton#4295

This would make #6140 redundant but let's see if bumping Triton causes any issues in CI.

Signed-off-by: Thomas Parnell <[email protected]>
@tdoublep
Member Author

Hmm, closing this for now since it seems to conflict with the torch version currently being used:

INFO: pip is looking at multiple versions of torch to determine which version is compatible with other requirements. This could take a while.
ERROR: Cannot install -r requirements-cuda.txt (line 7) and triton>=3.0.0 because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested triton>=3.0.0
    torch 2.3.0 depends on triton==2.3.0; platform_system == "Linux" and platform_machine == "x86_64" and python_version < "3.12"

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

We will need to wait for torch to move to Triton 3.0.0, I guess.
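
For reference, a minimal sketch (not part of this PR) of the check pip is effectively performing: it reads the triton requirement declared by the installed torch distribution and tests whether triton 3.0.0 could satisfy it.

```python
# Illustrative only: inspect the triton pin declared by the installed torch
# distribution and check whether it is compatible with triton >= 3.0.0.
from importlib.metadata import requires
from packaging.requirements import Requirement

for dep in requires("torch") or []:
    req = Requirement(dep)
    if req.name == "triton":
        # torch 2.3.0 declares "triton==2.3.0; platform_system == ...",
        # which no triton >= 3.0.0 release can satisfy, hence the resolver error.
        ok = req.specifier.contains("3.0.0")
        print(f"torch pins: {req}")
        print(f"triton 3.0.0 satisfies the pin: {ok}")
```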

@tdoublep closed this on Jul 10, 2024
@comaniac
Collaborator

To update this, we will need to wait for a torch release that moves to triton 3.0, followed by xformers, vllm-flash-attn, and flashinfer.
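
As a quick way to see where a given environment stands in that chain, a small helper like the following (package names taken from the comment above; purely illustrative) prints the installed version of each dependency:

```python
# Illustrative helper: report the installed version of each package in the
# upgrade chain mentioned above (torch, triton, xformers, vllm-flash-attn, flashinfer).
from importlib.metadata import version, PackageNotFoundError

for pkg in ("torch", "triton", "xformers", "vllm-flash-attn", "flashinfer"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```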

@jeejeelee
Collaborator

I think #6140 can serve as a solution for triton <3.0.0.
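
A minimal sketch of how such a triton-version gate might look (the function name below is hypothetical; the actual workaround lives in #6140 and may differ):

```python
# Hypothetical sketch of a version gate, assuming the #6140 workaround is only
# needed on triton < 3.0.0 (3.0.0 carries the fix from triton-lang/triton#4295).
from packaging.version import Version
import triton

def fused_moe_needs_workaround() -> bool:
    # The compile bug (#6103) affects MoE with tensor parallelism > 1
    # on triton < 3.0.0.
    return Version(triton.__version__) < Version("3.0.0")

if fused_moe_needs_workaround():
    # Fall back to the patched code path from #6140 (placeholder).
    pass
```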

Development

Successfully merging this pull request may close these issues.

[Bug]: fused_moe_kernel compile bug