
How can I use sharp with Pytorch(Tensor Parallel) #165

Open
47oo opened this issue Sep 18, 2024 · 0 comments

Comments


47oo commented Sep 18, 2024

My SHARP version is 3.5.
When I run NCCL + SHARP with Tensor Parallel (https://pytorch.org/docs/stable/distributed.tensor.parallel.html), the model is split into two TP groups, and each group creates its own NCCL communicator that requests its own set of SHARP resources, which causes a conflict between SHARP and NCCL. How can I use SHARP with NCCL properly when the tensor-parallel degree is 2?
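For reference, a minimal sketch of the setup in question, assuming a 2-way TP mesh built with `init_device_mesh` and a toy two-layer model (the model, dimensions, and parallelization plan here are illustrative, not the actual workload). `NCCL_COLLNET_ENABLE` is the standard NCCL knob for enabling SHARP (CollNet) offload; each sub-mesh communicator created below may attempt to allocate its own SHARP group:

```python
import os
import torch
import torch.distributed as dist
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor.parallel import (
    parallelize_module, ColwiseParallel, RowwiseParallel,
)

# Enable SHARP (CollNet) offload in NCCL.
os.environ.setdefault("NCCL_COLLNET_ENABLE", "1")

dist.init_process_group("nccl")
torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

# 2-way tensor parallelism: the world is split into a (dp, tp) mesh.
# Each sub-mesh gets its own NCCL communicator, so collectives on the
# "dp" and "tp" dimensions may each request SHARP resources.
world = dist.get_world_size()
mesh = init_device_mesh("cuda", (world // 2, 2), mesh_dim_names=("dp", "tp"))

# Toy model; layer names/sizes are placeholders.
model = torch.nn.Sequential(
    torch.nn.Linear(1024, 4096),
    torch.nn.ReLU(),
    torch.nn.Linear(4096, 1024),
).cuda()

# Shard the two linear layers across the "tp" dimension of the mesh.
parallelize_module(
    model,
    mesh["tp"],
    {"0": ColwiseParallel(), "2": RowwiseParallel()},
)
```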
