A collection of different optimizers implemented in PyTorch.
TODO: docs. For Adam/AdamW, use the built-in PyTorch implementation (`torch.optim.Adam` / `torch.optim.AdamW`).
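For reference, Adam/AdamW need no extra install since they ship with PyTorch. A minimal sketch (the model and hyperparameters below are placeholders):

```python
import torch

# placeholder model; swap in your own
model = torch.nn.Linear(128, 10)

# AdamW straight from torch.optim -- nothing from this repo is required
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4, weight_decay=0.01)
```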
To use these optimizers in your runs, clone the repo into an `optims` directory:
```sh
git clone https://github.com/IsNoobgrammer/Pytorch-Optimizers optims --quiet
```
Then import what you need:
```python
from optims.optim import Lion, CAME, Adafactor, SM3, Lilith
from optims.sophia import Sophia
from optims.loraplus import create_loraplus_params
```
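A minimal usage sketch follows, assuming the `Lion` constructor takes the common `Lion(params, lr=..., weight_decay=...)` arguments; check `optims/optim.py` for the exact signatures in this repo. The rest is the standard `torch.optim.Optimizer` step pattern, so it applies to the other optimizers as well.

```python
import torch
from optims.optim import Lion

# toy model and data; replace with your own
model = torch.nn.Linear(128, 10)
inputs = torch.randn(32, 128)
targets = torch.randint(0, 10, (32,))

# assumed constructor arguments -- verify against optims/optim.py
optimizer = Lion(model.parameters(), lr=1e-4, weight_decay=1e-2)
loss_fn = torch.nn.CrossEntropyLoss()

for step in range(10):
    optimizer.zero_grad()                      # clear old gradients
    loss = loss_fn(model(inputs), targets)     # forward pass
    loss.backward()                            # backward pass
    optimizer.step()                           # standard optimizer update
```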