Distill BLOOM - tentative 2 #354
base: main
Conversation
megatron/model/transformer.py
Outdated
        self.alibi = self.alibi.to(torch.bfloat16)
    else:
        self.alibi = None

@torch.no_grad()
nuke this
@@ -30,6 +41,9 @@ def _get_params_for_weight_decay_optimization(modules):

    weight_decay_params = {'params': []}
    no_weight_decay_params = {'params': [], 'weight_decay': 0.0}

    modules = _filter_for_teacher_student(modules)

    for module in modules:
Can we do the filtering in the for loop instead?
- check, before the first if, whether it is a Student module
- also check with isinstance instead of the class name (see the sketch below)
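A minimal sketch of the suggested shape, folding the filter into the existing loop; `Student` is a hypothetical marker class for the student model (this PR's real class name may differ):

```python
import torch
from torch.nn import LayerNorm

class Student(torch.nn.Module):
    """Hypothetical marker class for the student model (name assumed)."""

def _get_params_for_weight_decay_optimization(modules):
    weight_decay_params = {'params': []}
    no_weight_decay_params = {'params': [], 'weight_decay': 0.0}
    for module in modules:
        # Filter inline, before the first isinstance check, instead of a
        # separate _filter_for_teacher_student pass; use isinstance rather
        # than matching on the class name.
        if not isinstance(module, Student):
            continue
        for module_ in module.modules():
            if isinstance(module_, LayerNorm):
                no_weight_decay_params['params'].extend(
                    [p for p in module_._parameters.values() if p is not None])
            else:
                weight_decay_params['params'].extend(
                    [p for n, p in module_._parameters.items()
                     if p is not None and n != 'bias'])
                no_weight_decay_params['params'].extend(
                    [p for n, p in module_._parameters.items()
                     if p is not None and n == 'bias'])
    return weight_decay_params, no_weight_decay_params
```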
megatron/optimizer/__init__.py
Outdated
for module in modules:
    for module_ in module.modules():
        if "Student" in module_.__class__.__name__:
Use isinstance instead of matching on the class name.
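For example, with the same hypothetical `Student` class as above:

```python
for module in modules:
    for module_ in module.modules():
        if isinstance(module_, Student):  # rather than "Student" in module_.__class__.__name__
            ...
```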
# transformer layer
if args.pp_partition_method is not None:
    partition_method = args.pp_partition_method
else:
    partition_method = 'type:transformer'

from deepspeed.runtime.pipe.topology import PipeModelDataParallelTopology
cc @thomasw21
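For context, a rough sketch of how a partition method and the imported topology typically feed into DeepSpeed's PipelineModule; the layer list and dimensions here are illustrative, not this PR's actual wiring, and constructing the module requires a distributed launch:

```python
import torch
from deepspeed.pipe import PipelineModule
from deepspeed.runtime.pipe.topology import PipeModelDataParallelTopology

# Illustrative: 2 pipeline stages, no tensor parallelism, 4 data-parallel replicas.
topo = PipeModelDataParallelTopology(num_pp=2, num_mp=1, num_dp=4)

partition_method = 'type:transformer'   # or args.pp_partition_method if set
layers = [torch.nn.Linear(8, 8) for _ in range(4)]  # stand-in layer list

model = PipelineModule(layers=layers,
                       topology=topo,
                       partition_method=partition_method)
```

With 'type:transformer', DeepSpeed balances the pipeline stages by counting layers whose class name matches the given type.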
An attempt at applying teacher-student distillation using Megatron-DeepSpeed.
WIP draft PR, not meant to be merged.
cc @thomasw21
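As background, the generic teacher-student objective looks roughly like the sketch below; the temperature, reduction, and function name are assumptions, and the PR's actual loss may add further terms (e.g. hidden-state or attention matching):

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Soften both distributions, then pull the student toward the teacher
    # with KL divergence (the standard Hinton-style objective).
    t = temperature
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    # Scale by t**2 so gradients keep a comparable magnitude across temperatures.
    return F.kl_div(student_log_probs, teacher_probs, reduction='batchmean') * t * t
```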