
Use kernels from the kernel hub #2988

Merged: 25 commits merged into main from kernel-hub on Feb 10, 2025

Conversation

danieldk (Member) commented Feb 3, 2025

What does this PR do?

Use hub kernels for paged attention, MoE, and quantization (Marlin, CUTLASS, etc.).
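For context, a minimal sketch of what loading a kernel from the hub looks like, assuming the `kernels` package and the `kernels-community/activation` repository as in the public examples; this illustrates the mechanism, not the exact integration in this PR:

```python
# A hedged sketch, not the exact code in this PR: fetch a prebuilt kernel
# from the Hugging Face kernel hub at runtime instead of compiling it as
# part of the build. Repo and function names follow the public
# kernels-community examples and are assumptions here.
import torch
from kernels import get_kernel

# Downloads (and caches) a binary built for the local torch/CUDA/ABI
# combination and loads it as a regular Python module.
activation = get_kernel("kernels-community/activation")

x = torch.randn((10, 10), dtype=torch.float16, device="cuda")
y = torch.empty_like(x)
activation.gelu_fast(y, x)
```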

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@danieldk force-pushed the kernel-hub branch 4 times, most recently from fac14af to 3726ab7, on February 5, 2025 at 15:41.
@@ -90,7 +90,7 @@ mkShell {

postVenvCreation = ''
unset SOURCE_DATE_EPOCH
-    ( cd server ; python -m pip install --no-dependencies -e . )
+    ( cd server ; python -m pip install --no-build-isolation --no-dependencies -e . )
danieldk (Member, Author) commented:

This avoids downloading torch as a build dependency: without build isolation, pip uses the torch that is already in the environment.

@@ -230,3 +232,111 @@ def _pack_weight(
moe_weight.perm[expert] = weight.perm

return moe_weight


def fused_marlin_moe(
danieldk (Member, Author) commented:

I'd like to keep MoE on the kernel hub as close to vLLM as possible, so I moved this here together with our own extensions.
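A rough sketch of the pattern being described: the hub kernel stays vLLM-compatible, and repo-specific behavior lives in a local wrapper. The repo name and argument names below are assumptions, not the actual TGI/vLLM signatures:

```python
import torch
from kernels import get_kernel

# Hub kernel kept identical to vLLM's implementation; the repo name is
# an assumption for illustration.
moe = get_kernel("kernels-community/moe")


def fused_marlin_moe(
    hidden_states: torch.Tensor,
    w1: torch.Tensor,
    w2: torch.Tensor,
    gating_output: torch.Tensor,
    topk: int,
    renormalize: bool = True,
) -> torch.Tensor:
    # TGI-specific extensions (routing tweaks, scale handling, ...) would
    # be applied here, before delegating to the unmodified hub kernel.
    return moe.fused_marlin_moe(
        hidden_states,
        w1,
        w2,
        gating_output,
        topk=topk,
        renormalize=renormalize,
    )
```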

@@ -146,3 +159,110 @@ def _load_expert_weights_row(
assert all_weight is not None

return all_weight


def fused_moe(
danieldk (Member, Author) commented:

I'd like to keep MoE on the kernel hub as close to vLLM as possible, so I moved this here together with our own extensions.

@danieldk marked this pull request as ready for review on February 5, 2025 at 17:28.
Narsil (Collaborator) commented Feb 7, 2025

Something broke on Intel CPU; it's most likely due to a bad import failing on ipex-cpu.

(The failure shows up in the values, because the pure CPU + Transformers fallback takes over.)

@danieldk requested a review from Narsil on February 8, 2025 at 07:47.
danieldk (Member, Author) commented Feb 8, 2025

> Something broke on Intel CPU; it's most likely due to a bad import failing on ipex-cpu.
>
> (The failure shows up in the values, because the pure CPU + Transformers fallback takes over.)

Fixed. All checks pass 🎉.
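For reference, the usual way to guard an optional backend import looks roughly like this; a sketch only, the actual fix in this PR may differ:

```python
# A hedged sketch of a defensive optional-backend import; not the exact
# fix in this PR. An import error raised at module load time would
# silently push requests onto the slower pure CPU + Transformers fallback.
try:
    import intel_extension_for_pytorch as ipex  # noqa: F401

    IPEX_AVAILABLE = True
except ImportError:
    IPEX_AVAILABLE = False


def attention_backend() -> str:
    # Only choose the ipex path when the import actually succeeded.
    return "ipex" if IPEX_AVAILABLE else "fallback"
```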

Narsil (Collaborator) left a comment:

LGTM.

"filename": "build/torch26-cxx98-cu118-x86_64-linux/moe/configs/E=8,N=4096,device_name=NVIDIA_H100_80GB_HBM3,dtype=fp8_w8a8.json",
"blob_id": "cc614e635ea57327c610ce79e99ae5339614f22e"
},

Narsil (Collaborator) commented:

Holy guacamole. That's a long file...

danieldk (Member, Author) commented:

I think we can forego the per-file hashes later?
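The filename above encodes the tuning key (expert count, shard size, device name, dtype). A hedged sketch of how such per-shape MoE configs are typically resolved, following the vLLM naming convention these files mirror; the helper below is an assumption, not TGI's actual code:

```python
import json
from pathlib import Path

import torch


def moe_config_path(num_experts: int, shard_size: int, dtype: str) -> Path:
    # Mirrors the naming convention in the file above:
    # E=<num experts>,N=<shard size>,device_name=<GPU>,dtype=<...>.json
    device_name = torch.cuda.get_device_name().replace(" ", "_")
    filename = (
        f"E={num_experts},N={shard_size},"
        f"device_name={device_name},dtype={dtype}.json"
    )
    return Path("configs") / filename


def load_moe_config(num_experts: int, shard_size: int, dtype: str) -> dict | None:
    path = moe_config_path(num_experts, shard_size, dtype)
    if path.exists():
        # Keys are batch sizes; values are tuned Triton launch parameters.
        return json.loads(path.read_text())
    # No tuned config for this shape: the kernel falls back to defaults.
    return None
```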

@danieldk merged commit 571ac9b into main on Feb 10, 2025.
20 checks passed.
@danieldk deleted the kernel-hub branch on February 10, 2025 at 18:19.