Can you support automatic mixed precision (AMP) in the ZeRO optimizer? #121

Open
LiweiPeng opened this issue Mar 6, 2020 · 2 comments
Labels: enhancement (New feature or request)

Comments

@LiweiPeng

Currently, the fp16 optimizer and the ZeRO optimizer implement a 'manual' mixed precision algorithm. More and more people are using AMP, and NVIDIA Apex itself warns users away from its fp16 optimizer (https://github.com/NVIDIA/apex/blob/master/apex/fp16_utils/fp16_optimizer.py#L20).

Can you add an AMP-based mixed precision implementation to the ZeRO optimizer? Thanks.
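
For reference, the kind of native PyTorch AMP training loop being requested looks roughly like the sketch below. This is a minimal, illustrative example, not DeepSpeed code; the model, data, and hyperparameters are placeholders.

```python
# Sketch of the torch.cuda.amp workflow the issue asks ZeRO to integrate with.
# Everything here is illustrative; a real setup would use its own model and data.
import torch
import torch.nn as nn

device = "cuda"
model = nn.Linear(1024, 1024).to(device)        # toy model standing in for a real network
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler()            # dynamic loss scaling handled automatically

for _ in range(10):
    x = torch.randn(32, 1024, device=device)
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():             # ops run in fp16/fp32 as chosen by autocast
        loss = model(x).float().pow(2).mean()
    scaler.scale(loss).backward()               # scale the loss to avoid fp16 gradient underflow
    scaler.step(optimizer)                      # unscales grads; skips the step on inf/nan
    scaler.update()
```

The point of the request is that this scaling and casting logic would be handled by torch's AMP machinery rather than re-implemented manually inside the fp16/ZeRO optimizers.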

ShadenSmith added the enhancement (New feature or request) label Mar 26, 2020

rosrad commented Jun 27, 2021

Supporting AMP with ZeRO would be a very important feature.

delock pushed a commit to delock/DeepSpeedSYCLSupport that referenced this issue Nov 11, 2022
@xingchensong

Hi team, is there any plan to support AMP O1 in ZeRO?
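
For context, "AMP O1" refers to NVIDIA Apex's O1 optimization level, which DeepSpeed exposes through the `amp` section of its config. Below is a minimal sketch of the combination this comment is asking for; the values are placeholders, and whether `amp` and `zero_optimization` can actually be enabled together is precisely what this issue asks about.

```python
# Illustrative DeepSpeed config dict combining Apex-style AMP (opt level O1) with ZeRO.
# This is the combination being requested, not a configuration guaranteed to work.
ds_config = {
    "train_batch_size": 32,
    "amp": {
        "enabled": True,        # DeepSpeed's Apex AMP integration
        "opt_level": "O1"       # casts whitelisted ops to fp16 while keeping fp32 weights
    },
    "zero_optimization": {
        "stage": 2              # partition optimizer states and gradients across ranks
    },
}

# model_engine, optimizer, _, _ = deepspeed.initialize(
#     model=model, model_parameters=model.parameters(), config=ds_config)
```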

xiexbing self-assigned this Sep 8, 2023