Currently, the FP16 optimizer and the ZeRO optimizer implement a 'manual' mixed precision algorithm. More and more people are using AMP, and NVIDIA Apex itself warns users away from its FP16 optimizer (https://github.com/NVIDIA/apex/blob/master/apex/fp16_utils/fp16_optimizer.py#L20).
Could you add an AMP-based mixed precision implementation to the ZeRO optimizer? Thanks.
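For reference, a minimal sketch of what the 'automatic' style of mixed precision looks like with PyTorch's native `torch.cuda.amp` (as opposed to the manual FP16 optimizer). The model and optimizer here are placeholders, not DeepSpeed or ZeRO APIs; this only illustrates the pattern being requested.

```python
# Minimal torch.cuda.amp sketch: autocast for the forward pass plus
# GradScaler for dynamic loss scaling. Placeholder model/optimizer only.
import torch

model = torch.nn.Linear(1024, 1024).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler()  # handles dynamic loss scaling

for step in range(10):
    inputs = torch.randn(32, 1024, device="cuda")
    targets = torch.randn(32, 1024, device="cuda")

    optimizer.zero_grad()
    # autocast chooses fp16/fp32 per-op automatically during the forward pass
    with torch.cuda.amp.autocast():
        outputs = model(inputs)
        loss = torch.nn.functional.mse_loss(outputs, targets)

    scaler.scale(loss).backward()  # backward on the scaled loss
    scaler.step(optimizer)         # unscales grads, skips step on inf/NaN
    scaler.update()                # adjusts the loss scale
```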
Supporting AMP together with ZeRO is quite an important feature.
Hi team, is there any plan to support AMP O1 in ZeRO?
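For context, this is what plain Apex AMP O1 usage looks like; the sketch below is illustrative only (placeholder model and optimizer) and shows Apex on its own, not an existing ZeRO integration, which is exactly what this issue asks for.

```python
# Sketch of Apex AMP O1: Torch functions are patched to cast to fp16 where
# safe while fp32 master weights are kept. Not a ZeRO/DeepSpeed API.
import torch
from apex import amp

model = torch.nn.Linear(1024, 1024).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# O1 is the "mixed precision via patched functions" opt level
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

inputs = torch.randn(32, 1024, device="cuda")
loss = model(inputs).pow(2).mean()

# scale_loss applies dynamic loss scaling before backward
with amp.scale_loss(loss, optimizer) as scaled_loss:
    scaled_loss.backward()
optimizer.step()
```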