Feb 15, 2023
The Adam optimizer is at the heart of modern AI. Researchers have been trying to dethrone Adam for years.
How about we ask a machine to do a better job? @GoogleAI uses evolution to discover a simpler & more efficient algorithm with remarkable features.
It’s just 8 lines of code: 🧡
The discovered “Lion” optimizer is able to boost the accuracy of Vision Transformers (ViT) by up to 2% on ImageNet, reduce training compute by up to 2.3x for diffusion models, and achieve comparable performance on LLMs. It is more memory-efficient compared to human designs.
2/
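The update rule described in the thread can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact code: the parameter names and default hyperparameters here are assumptions.

```python
import numpy as np

def lion_update(theta, grad, m, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion step (illustrative sketch, not the authors' reference code).

    theta: parameters, grad: gradient, m: momentum buffer.
    Returns the updated (theta, m).
    """
    # Interpolate between momentum and gradient, then keep only the SIGN.
    update = np.sign(beta1 * m + (1 - beta1) * grad)
    # Apply the sign update plus decoupled weight decay.
    theta = theta - lr * (update + wd * theta)
    # Update the momentum buffer with a second interpolation coefficient.
    m = beta2 * m + (1 - beta2) * grad
    return theta, m
```

Because the update is always ±1 per coordinate (scaled by the learning rate), Lion only needs to store one momentum buffer, which is where the memory savings over Adam's two buffers come from.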
Remarkably, the evolutionary search decides that the SIGN of the gradient is all you need. For example, if the gradient is [-0.31, 0.43, -0.21], Lion turns it into [-1, 1, -1] for the update vector. This is counter-intuitive and nontrivial for human researchers to come up with.
3/
I want to highlight that this paper is a great demonstration of a *scalable* symbolic AI system. There are prior works that propose neural network-based, learned meta-optimizers. But Lion is a much simpler symbolic form that is interpretable and lightweight to incorporate.
4/
Paper: arxiv.org
Authors: @XiangningChen, @crazydonkey200, Da Huang, Esteban Real, Kaiyuan Wang, Yao Liu, Hieu Pham, Xuanyi Dong, Thang Luong, Cho-Jui Hsieh, Yifeng Lu, @quocleix.
Follow me for more deep dives ;)
