
Add AdaBelief optimizer [https://arxiv.org/abs/2010.07468] #2203

Closed
TimbusCalin opened this issue Oct 17, 2020 · 8 comments

Comments


TimbusCalin commented Oct 17, 2020

Describe the feature and the current behavior/state.

Relevant information

Which API type would this fall under (layer, metric, optimizer, etc.): Optimizer/Layer

Who will benefit from this feature? Everybody; the paper reports better, more promising results than the other adaptive optimizers.

Any other info.
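For readers landing here without the paper: AdaBelief's only change to Adam is the second-moment accumulator, which tracks the EMA of the squared deviation of the gradient from its own EMA, (g_t − m_t)², instead of g_t². A minimal NumPy sketch of one update step (function and variable names are mine, and this is a paraphrase of Algorithm 2 in the linked paper, not a reference implementation):

```python
import numpy as np

def adabelief_step(theta, g, m, s, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AdaBelief parameter update (illustrative sketch, not adabelief-tf)."""
    m = beta1 * m + (1 - beta1) * g                   # EMA of gradients, as in Adam
    s = beta2 * s + (1 - beta2) * (g - m) ** 2 + eps  # the only change vs Adam:
                                                      # variance around the EMA, not g**2
    m_hat = m / (1 - beta1 ** t)                      # Adam-style bias correction
    s_hat = s / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(s_hat) + eps)
    return theta, m, s
```

When successive gradients agree (small g − m), s stays small and the effective step is large; when they disagree, the step shrinks. That is the "belief in the gradient" intuition behind the name.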

AakashKumarNain (Member) commented Oct 17, 2020

Although AdaBelief looks interesting, we will wait for some time before adding it here. Why? Many optimizers have shown improvements over Adam in the past as well, but they failed to deliver that performance when tested on a wide variety of datasets. Once we get more insight into its performance, we will start the process of adding it to Addons with a PR.

TimbusCalin (Author) commented

Yes, I agree with your statement.

bhack (Contributor) commented Oct 17, 2020

Yes, see recent work like https://arxiv.org/abs/2007.01547

bhack (Contributor) commented Oct 27, 2020

@dathudeptrai

TF reference implementation at https://github.com/juntang-zhuang/Adabelief-Optimizer/blob/master/TensorFlow_Experiments/AdaBelief_tf.py

Seems that implementation missed many features; hope this optimizer is included in tf_addons soon :D.

The adabelief-tf is a naive implementation in Tensorflow. It lacks many features such as decoupled weight decay, and is not extensively tested. Currently I don't have plans to improve it since I seldom use Tensorflow, please contact me if you want to collaborate and improve it.

juntang-zhuang (Contributor) commented Nov 4, 2020

> TF reference implementation at https://github.com/juntang-zhuang/Adabelief-Optimizer/blob/master/TensorFlow_Experiments/AdaBelief_tf.py
>
> Seems that implementation missed many features, hope this optimizer included in tf_addons soon :D.
>
> The adabelief-tf is a naive implementation in Tensorflow. It lacks many features such as decoupled weight decay, and is not extensively tested. Currently I don't have plans to improve it since I seldom use Tensorflow, please contact me if you want to collaborate and improve it.

Hi, we have updated adabelief-tf==0.1.0 based on tensorflow-addons. It is also available via pip, supports decoupled weight decay and rectification, and works with Tensorflow>=2.0 and Keras.
See the source code for adabelief-tf==0.1.0 here:
https://github.com/juntang-zhuang/Adabelief-Optimizer/blob/update_0.1.0/pypi_packages/adabelief_tf0.1.0/adabelief_tf/AdaBelief_tf.py
See the repository for toy examples on a text classification task and a word embedding task.
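For readers unfamiliar with the "decoupled weight decay" mentioned above: AdamW-style decoupling shrinks the weights directly by lr·wd each step rather than adding wd·θ to the gradient, so the decay is not divided by the adaptive denominator. A hypothetical sketch combining it with an AdaBelief-style update (names and defaults are mine, not the adabelief-tf source):

```python
import numpy as np

def adabelief_step_wd(theta, g, m, s, t, lr=1e-3, beta1=0.9, beta2=0.999,
                      eps=1e-8, weight_decay=1e-2, decoupled=True):
    """Illustrative only -- not the adabelief-tf implementation."""
    if decoupled:
        theta = theta * (1.0 - lr * weight_decay)  # AdamW-style: decay bypasses
                                                   # the adaptive rescaling below
    else:
        g = g + weight_decay * theta               # classic L2: decay gets divided
                                                   # by sqrt(s_hat) like the gradient
    m = beta1 * m + (1 - beta1) * g
    s = beta2 * s + (1 - beta2) * (g - m) ** 2 + eps
    m_hat = m / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    return theta - lr * m_hat / (np.sqrt(s_hat) + eps), m, s
```

With a zero gradient, the decoupled variant shrinks θ geometrically by (1 − lr·wd) per step regardless of s, while the coupled variant's shrinkage depends on the adaptive denominator; that independence is why decoupling tends to make the effective regularization strength easier to tune.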

bhack (Contributor) commented Nov 10, 2020

This is the first AdaBelief citation with code:
https://github.com/yuanwei2019/EAdam-optimizer

seanpmorgan (Member) commented

TensorFlow Addons is transitioning to a minimal maintenance and release mode. New features will not be added to this repository. For more information, please see our public messaging on this decision: TensorFlow Addons Wind Down.

Please consider sending feature requests / contributions to other repositories in the TF community with charters similar to TFA's:

- Keras
- Keras-CV
- Keras-NLP
