
Add MCore FSDP2 support #11216

Open · wants to merge 56 commits into main

Conversation

@BoxiangW (Collaborator) commented Nov 7, 2024

What does this PR do ?

Add MCore FSDP2 support

Collection: [Note which collection this PR will affect]

Changelog

  • Add specific line by line info of high level changes in this PR.

Usage

  • You can potentially add a usage example below
# Add a code snippet demonstrating how to use this 
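A minimal usage sketch, assuming the `fsdp` flag added to `MegatronStrategy` in this PR; the surrounding setup and the final flag name (still under discussion in the review below) are illustrative only:

```python
# Hedged sketch: only the `fsdp` flag comes from this PR's diff; the rest is an
# illustrative NeMo 2.0-style setup and may differ from the final API.
import nemo.lightning as nl

strategy = nl.MegatronStrategy(
    tensor_model_parallel_size=1,
    fsdp=True,  # request FSDP sharding instead of the default "megatron" DDP
)

trainer = nl.Trainer(
    devices=8,
    accelerator="gpu",
    strategy=strategy,
)
```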

GitHub Actions CI

The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.

The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR.
To re-run CI, remove and add the label again.
To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (e.g., Numba, Pynini, Apex, etc.)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • New Feature
  • Bugfix
  • Documentation

If you haven't finished some of the above items, you can still open a "Draft" PR.

Who can review?

Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines list the specific people who can review PRs in various areas.

Additional Information

  • Related to # (issue)

@BoxiangW BoxiangW self-assigned this Nov 8, 2024
@BoxiangW BoxiangW added the feature (request/PR for a new feature) and Run CICD labels Nov 8, 2024
@BoxiangW BoxiangW added Run CICD and removed Run CICD labels Nov 8, 2024
@BoxiangW BoxiangW added Run CICD and removed Run CICD labels Nov 8, 2024
@@ -193,6 +195,7 @@ def __init__(
         ckpt_load_optimizer: bool = True,
         ckpt_save_optimizer: bool = True,
         ddp: Union[DDPLiteral, DistributedDataParallelConfig] = "megatron",
+        fsdp: bool = False,
Collaborator

should we rename it torch_fsdp because we know we'll have mcore_fsdp eventually?

@BoxiangW (Collaborator, Author) Nov 22, 2024

Do you think we can make this arg a string (['mcore', 'torch']) or None?
Or expand it into two flags: fsdp defaults to False, and use_torch_fsdp defaults to False?
So:
fsdp=True, use_torch_fsdp=True -> torch_fsdp
fsdp=True -> mcore_fsdp (to be supported)
nothing -> ddp

Just trying to make the interface less confusing. (A sketch of this mapping follows below.)
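A minimal sketch of that second proposal (the flag and function names here are hypothetical, not code from this PR), just to make the mapping explicit:

```python
def resolve_data_parallel_backend(fsdp: bool = False, use_torch_fsdp: bool = False) -> str:
    """Map the two proposed boolean flags onto a data-parallel backend (illustration only)."""
    if fsdp and use_torch_fsdp:
        return "torch_fsdp"  # PyTorch FSDP2
    if fsdp:
        return "mcore_fsdp"  # Megatron-Core FSDP, to be supported
    return "ddp"  # default: regular distributed data parallel


assert resolve_data_parallel_backend(fsdp=True, use_torch_fsdp=True) == "torch_fsdp"
assert resolve_data_parallel_backend(fsdp=True) == "mcore_fsdp"
assert resolve_data_parallel_backend() == "ddp"
```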

@BoxiangW BoxiangW marked this pull request as ready for review November 22, 2024 23:17
HuiyingLi and others added 20 commits December 20, 2024 15:22
…11631)

* Fix Optimizer & LR scheduler Resume

* fix unit test

Signed-off-by: Chen Cui <[email protected]>

* Apply isort and black reformatting

Signed-off-by: cuichenx <[email protected]>

* typo

Signed-off-by: Chen Cui <[email protected]>

* Fix consume samples

* Fix unit tests

* Apply isort and black reformatting

Signed-off-by: suiyoubi <[email protected]>

---------

Signed-off-by: Chen Cui <[email protected]>
Signed-off-by: cuichenx <[email protected]>
Signed-off-by: suiyoubi <[email protected]>
Co-authored-by: Chen Cui <[email protected]>
Co-authored-by: cuichenx <[email protected]>
Co-authored-by: suiyoubi <[email protected]>
* Add vlm inference

* Add init

* Apply isort and black reformatting

Signed-off-by: meatybobby <[email protected]>

* Add KV cache and xattn cache in inference

* Fix position id for KV cache

* Apply isort and black reformatting

Signed-off-by: meatybobby <[email protected]>

* Add doc string

* pylint fix

* Remove max_output_len in inference controller

* Modify generate script

* Apply isort and black reformatting

Signed-off-by: meatybobby <[email protected]>

* Rename wrapped model

* Rename var

---------

Signed-off-by: meatybobby <[email protected]>
Co-authored-by: meatybobby <[email protected]>
* Add slimpajama example

Signed-off-by: Hemil Desai <[email protected]>

* Apply isort and black reformatting

Signed-off-by: hemildesai <[email protected]>

* Fix

Signed-off-by: Hemil Desai <[email protected]>

* Fixes

Signed-off-by: Hemil Desai <[email protected]>

* Fixes

Signed-off-by: Hemil Desai <[email protected]>

* Fixes

Signed-off-by: Hemil Desai <[email protected]>

* Fix

Signed-off-by: Hemil Desai <[email protected]>

* Add notebook

Signed-off-by: Hemil Desai <[email protected]>

* Fix

Signed-off-by: Hemil Desai <[email protected]>

* Add basic pretraining notebook for slimpajama

Signed-off-by: Hemil Desai <[email protected]>

* Add docs for pretraining notebook

Signed-off-by: Hemil Desai <[email protected]>

* PR feedback

Signed-off-by: Hemil Desai <[email protected]>

* PR feedback

Signed-off-by: Hemil Desai <[email protected]>

* Pylint fixes

Signed-off-by: Hemil Desai <[email protected]>

* Apply isort and black reformatting

Signed-off-by: hemildesai <[email protected]>

* PR feedback

Signed-off-by: Hemil Desai <[email protected]>

* Update README

Signed-off-by: Hemil Desai <[email protected]>

* PR feedback

Signed-off-by: Hemil Desai <[email protected]>

---------

Signed-off-by: Hemil Desai <[email protected]>
Signed-off-by: hemildesai <[email protected]>
Co-authored-by: hemildesai <[email protected]>
* remove nemo1 docs

Signed-off-by: Chen Cui <[email protected]>

* fix link

Signed-off-by: Chen Cui <[email protected]>

---------

Signed-off-by: Chen Cui <[email protected]>
* Change peft merge model to bf16

Signed-off-by: HuiyingLi <[email protected]>

* Apply isort and black reformatting

Signed-off-by: HuiyingLi <[email protected]>

---------

Signed-off-by: HuiyingLi <[email protected]>
Signed-off-by: HuiyingLi <[email protected]>
Co-authored-by: HuiyingLi <[email protected]>
…11609)

* Add depth pruning (layer dropping) to megatron_gpt_prune.py

Signed-off-by: Keval Morabia <[email protected]>

* Apply isort and black reformatting

Signed-off-by: kevalmorabia97 <[email protected]>

---------

Signed-off-by: Keval Morabia <[email protected]>
Signed-off-by: kevalmorabia97 <[email protected]>
Co-authored-by: kevalmorabia97 <[email protected]>
* add torch dist support

Signed-off-by: dimapihtar <[email protected]>

* Apply isort and black reformatting

Signed-off-by: dimapihtar <[email protected]>

* add changes

Signed-off-by: dimapihtar <[email protected]>

* Apply isort and black reformatting

Signed-off-by: dimapihtar <[email protected]>

* revert changes

Signed-off-by: Dmytro Pykhtar <[email protected]>

* revert changes

Signed-off-by: Dmytro Pykhtar <[email protected]>

* add deprecation notes

Signed-off-by: Dmytro Pykhtar <[email protected]>

* Apply isort and black reformatting

Signed-off-by: dimapihtar <[email protected]>

* add readme

Signed-off-by: Dmytro Pykhtar <[email protected]>

* update readme

Signed-off-by: Dmytro Pykhtar <[email protected]>

* update readme

Signed-off-by: Dmytro Pykhtar <[email protected]>

* update readme

Signed-off-by: Dmytro Pykhtar <[email protected]>

* rename script

Signed-off-by: Dmytro Pykhtar <[email protected]>

* Apply isort and black reformatting

Signed-off-by: dimapihtar <[email protected]>

* update readme

Signed-off-by: Dmytro Pykhtar <[email protected]>

* fix style

Signed-off-by: dimapihtar <[email protected]>

* fix styling

Signed-off-by: dimapihtar <[email protected]>

* fix styling

Signed-off-by: dimapihtar <[email protected]>

* Apply isort and black reformatting

Signed-off-by: dimapihtar <[email protected]>

* remove unused import

Signed-off-by: dimapihtar <[email protected]>

---------

Signed-off-by: dimapihtar <[email protected]>
Signed-off-by: dimapihtar <[email protected]>
Signed-off-by: Dmytro Pykhtar <[email protected]>
Co-authored-by: dimapihtar <[email protected]>
Co-authored-by: Dmytro Pykhtar <[email protected]>
…_ASR.ipynb and ASR_CTC_Language_Finetuning.ipynb (#11675)

* Downgrading the 'datasets' package from 3.0.0 to 2.21.0 for Multilang_ASR.ipynb

Signed-off-by: Weiqing Wang <[email protected]>

* Downgrading the 'datasets' package from 3.0.0 to 2.21.0 for ASR_CTC_Language_Finetuning.ipynb

Signed-off-by: Weiqing Wang <[email protected]>

---------

Signed-off-by: Weiqing Wang <[email protected]>
…point context io.json (#11648)

* Utils to detect and drop deprecated arguments in io.json

Signed-off-by: Jan Lasek <[email protected]>

* Unit tests for drop_unexpected_params

Signed-off-by: Jan Lasek <[email protected]>

* Apply isort and black reformatting

Signed-off-by: janekl <[email protected]>

* Add copyright header

Signed-off-by: Jan Lasek <[email protected]>

---------

Signed-off-by: Jan Lasek <[email protected]>
Signed-off-by: janekl <[email protected]>
Co-authored-by: janekl <[email protected]>
* Remove trt_compile from __init__ as it triggers imports from nemo.utils

Signed-off-by: Jan Lasek <[email protected]>

* Get tokenizer for NeMo 2 from model.yaml using local SP or HF classes

Signed-off-by: Jan Lasek <[email protected]>

* Apply isort and black reformatting

Signed-off-by: janekl <[email protected]>

---------

Signed-off-by: Jan Lasek <[email protected]>
Signed-off-by: janekl <[email protected]>
Co-authored-by: janekl <[email protected]>
* Fix baichuan export

Signed-off-by: Chen Cui <[email protected]>

* update import

Signed-off-by: Chen Cui <[email protected]>

---------

Signed-off-by: Chen Cui <[email protected]>
* rename multimodal data module

* Apply isort and black reformatting

Signed-off-by: yashaswikarnati <[email protected]>

* fix long lengths

* fix lint issues

* fix long lint issues

---------

Signed-off-by: yashaswikarnati <[email protected]>
Co-authored-by: yashaswikarnati <[email protected]>
…Sortformer model (#11671)

* Fixing the device assignment issues during inference (test_batch)

Signed-off-by: taejinp <[email protected]>

* Removing the commented code lines

Signed-off-by: taejinp <[email protected]>

---------

Signed-off-by: taejinp <[email protected]>
* add timestamp support

Signed-off-by: kevinhu <[email protected]>

* Add unit test, fix a branch, and refactor.

Signed-off-by: kevinhu <[email protected]>

* Apply isort and black reformatting

Signed-off-by: kevinhu-nv <[email protected]>
Signed-off-by: kevinhu <[email protected]>

* Apply isort and black reformatting

Signed-off-by: kevinhu-nv <[email protected]>

---------

Signed-off-by: kevinhu <[email protected]>
Signed-off-by: kevinhu-nv <[email protected]>
Co-authored-by: kevinhu-nv <[email protected]>
* Make LinearAdapter a nn.Linear child to maintain ckpt structure

Signed-off-by: Alexandros Koumparoulis <[email protected]>

* add _is_fsdp_v1 attribute

Signed-off-by: Alexandros Koumparoulis <[email protected]>

* lora+fsdp

Signed-off-by: Alexandros Koumparoulis <[email protected]>

* set precision=bf16 in nl.Trainer

Signed-off-by: Alexandros Koumparoulis <[email protected]>

* add test

Signed-off-by: Alexandros Koumparoulis <[email protected]>

* fix

Signed-off-by: Alexandros Koumparoulis <[email protected]>

* fix

Signed-off-by: Alexandros Koumparoulis <[email protected]>

* fix

Signed-off-by: Alexandros Koumparoulis <[email protected]>

* Apply isort and black reformatting

Signed-off-by: akoumpa <[email protected]>

* add unit tests

Signed-off-by: Alexandros Koumparoulis <[email protected]>

* Apply isort and black reformatting

Signed-off-by: akoumpa <[email protected]>

* Update docs

Signed-off-by: Alexandros Koumparoulis <[email protected]>

* Apply isort and black reformatting

Signed-off-by: akoumpa <[email protected]>

---------

Signed-off-by: Alexandros Koumparoulis <[email protected]>
Signed-off-by: akoumpa <[email protected]>
Co-authored-by: akoumpa <[email protected]>

github-actions bot commented Jan 4, 2025

This PR is stale because it has been open for 14 days with no activity. Remove the stale label, comment, or update the PR, or it will be closed in 7 days.

@github-actions github-actions bot added the stale label Jan 4, 2025
@BoxiangW BoxiangW removed the stale label Jan 8, 2025
github-actions bot commented Jan 23, 2025

This PR is stale because it has been open for 14 days with no activity. Remove the stale label, comment, or update the PR, or it will be closed in 7 days.

@github-actions github-actions bot added the stale label Jan 23, 2025
@BoxiangW BoxiangW removed the stale label Jan 23, 2025
Contributor

beep boop 🤖: 🙏 The following files have warnings. If you are familiar with these, please help us improve the code base.


Your code was analyzed with PyLint. The following annotations have been identified:

************* Module nemo.lightning.megatron_parallel
nemo/lightning/megatron_parallel.py:261:0: C0301: Line too long (127/119) (line-too-long)
nemo/lightning/megatron_parallel.py:262:0: C0301: Line too long (140/119) (line-too-long)
nemo/lightning/megatron_parallel.py:263:0: C0301: Line too long (130/119) (line-too-long)
nemo/lightning/megatron_parallel.py:569:0: C0301: Line too long (129/119) (line-too-long)
nemo/lightning/megatron_parallel.py:576:0: C0301: Line too long (135/119) (line-too-long)
nemo/lightning/megatron_parallel.py:904:0: C0301: Line too long (137/119) (line-too-long)
nemo/lightning/megatron_parallel.py:1134:0: C0301: Line too long (136/119) (line-too-long)
nemo/lightning/megatron_parallel.py:1707:0: C0301: Line too long (128/119) (line-too-long)
nemo/lightning/megatron_parallel.py:1746:0: C0301: Line too long (146/119) (line-too-long)
nemo/lightning/megatron_parallel.py:75:0: C0115: Missing class docstring (missing-class-docstring)
nemo/lightning/megatron_parallel.py:76:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:78:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:113:0: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:117:0: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:329:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:353:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:379:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:405:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:541:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:584:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:588:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:656:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:691:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:697:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:703:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:710:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:717:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:751:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:759:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:775:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:802:0: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:814:0: C0115: Missing class docstring (missing-class-docstring)
nemo/lightning/megatron_parallel.py:836:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:845:4: C0115: Missing class docstring (missing-class-docstring)
nemo/lightning/megatron_parallel.py:874:8: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:1400:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:1575:0: C0115: Missing class docstring (missing-class-docstring)
nemo/lightning/megatron_parallel.py:1581:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:1587:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:1591:4: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:1596:0: C0115: Missing class docstring (missing-class-docstring)
nemo/lightning/megatron_parallel.py:1601:0: C0115: Missing class docstring (missing-class-docstring)
nemo/lightning/megatron_parallel.py:1629:0: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:1675:8: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:1697:0: C0115: Missing class docstring (missing-class-docstring)
nemo/lightning/megatron_parallel.py:1770:0: C0115: Missing class docstring (missing-class-docstring)
nemo/lightning/megatron_parallel.py:1813:0: C0116: Missing function or method docstring (missing-function-docstring)
nemo/lightning/megatron_parallel.py:1827:0: C0116: Missing function or method docstring (missing-function-docstring)
************* Module nemo.lightning.pytorch.strategies.megatron_strategy
nemo/lightning/pytorch/strategies/megatron_strategy.py:143:0: C0301: Line too long (121/119) (line-too-long)
nemo/lightning/pytorch/strategies/megatron_strategy.py:315:4: C0116: Missing function or method docstring (missing-function-docstring)

-----------------------------------
Your code has been rated at 9.56/10

Mitigation guide:

  • Add sensible and useful docstrings to functions and methods
  • For trivial methods like getters/setters, consider adding # pylint: disable=C0116 inside the function itself
  • To disable multiple functions/methods at once, put a # pylint: disable=C0116 before the first and a # pylint: enable=C0116 after the last (see the sketch below).
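For instance, a minimal sketch (the class and its methods are hypothetical and only illustrate the directives above):

```python
class CallbackRegistry:
    """Hypothetical registry used only to illustrate the pylint directives above."""

    def __init__(self):
        self._callbacks = []

    # pylint: disable=C0116
    def add(self, callback):
        self._callbacks.append(callback)

    def clear(self):
        self._callbacks = []
    # pylint: enable=C0116

    def run(self, name, *args):
        """Dispatch `name` to every registered callback that implements it."""
        for callback in self._callbacks:
            getattr(callback, name, lambda *_: None)(*args)
```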

By applying these rules, we reduce the occurrence of this message in the future.

Thank you for improving NeMo's documentation!


github-actions bot commented Feb 7, 2025

This PR is stale because it has been open for 14 days with no activity. Remove the stale label, comment, or update the PR, or it will be closed in 7 days.

@github-actions github-actions bot added the stale label Feb 7, 2025
@BoxiangW BoxiangW removed the stale label Feb 13, 2025
Labels
feature (request/PR for a new feature), Run CICD