
ViT Speedup #399

Merged: 2 commits into main from feature/vit_speedup, Jul 2, 2024

Conversation

benjijamorris (Contributor)

What does this PR do?

  • convert random numpy code to torch
  • initialize tensors on device for speedup
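The two bullets above describe a common pattern; the sketch below is a hypothetical illustration (not the PR's actual code, and `random_indexes` is an invented name) of replacing numpy-generated shuffle indexes with torch ops created directly on the target device, which avoids a CPU-to-GPU copy:

```python
import torch

def random_indexes(n_tokens: int, device: torch.device = "cpu"):
    """Return forward and backward shuffle indexes, built on-device."""
    # torch.randperm on the device replaces e.g. np.random.permutation + torch.from_numpy
    forward = torch.randperm(n_tokens, device=device)
    # argsort of a permutation gives its inverse
    backward = torch.argsort(forward)
    return forward, backward

fwd, bwd = random_indexes(8)
x = torch.arange(8)
# applying the forward shuffle then the backward shuffle restores the original order
assert torch.equal(x[fwd][bwd], x)
```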

Before submitting

  • Did you make sure the title is self-explanatory and the description concisely explains the PR?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you test your PR locally with the pytest command?
  • Did you run the pre-commit hooks with pre-commit run -a?

Did you have fun?

Make sure you had fun coding 🙃

@benjijamorris benjijamorris requested a review from ritvikvasan July 2, 2024 17:07
ritvikvasan (Member) previously approved these changes Jul 2, 2024

@ritvikvasan left a comment


Looks good! Some minor comments, but ignore them if not useful.

*[
    Rearrange(
        "(n_patch_y n_patch_x) b c -> b c n_patch_y n_patch_x",
        n_patch_y=n_patches[1],
Member

what is n_patches here?

Reduce(
    "b c n_patch_y n_patch_x -> b c (n_patch_y patch_size_y) (n_patch_x patch_size_x)",
    reduction="repeat",
    patch_size_y=patch_size[1],
Member

what is patch_size[1]?

cyto_dl/nn/vits/blocks/patchify.py
@@ -90,18 +90,28 @@ def init_weight(self):

def forward(self, features, forward_indexes, backward_indexes):
# HACK TODO allow usage of multiple intermediate feature weights, this works when decoder is 0 layers
-        features = features.squeeze(0)
+        features = features[0]
Member
what is features[0]?
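For illustration (hypothetical shapes): `features[0]` selects the first slice along dim 0, which matches the old `squeeze(0)` only when that dim has size 1; unlike `squeeze(0)`, it also picks the first slice when the dim is larger, which fits the "works when decoder is 0 layers" hack noted above:

```python
import torch

features = torch.randn(1, 5, 3)  # hypothetical (n_feature_maps, tokens, dim)
a = features.squeeze(0)  # drops dim 0 only because its size is 1
b = features[0]          # always selects the first slice along dim 0
assert torch.equal(a, b)

stacked = torch.randn(2, 5, 3)
# squeeze(0) is a no-op on a size-2 dim; indexing still picks the first slice
assert stacked.squeeze(0).shape == (2, 5, 3)
assert stacked[0].shape == (5, 3)
```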

cyto_dl/nn/vits/cross_mae.py
@benjijamorris benjijamorris merged commit 592177b into main Jul 2, 2024
4 of 6 checks passed
@benjijamorris benjijamorris deleted the feature/vit_speedup branch July 2, 2024 21:58