
Sparse is Enough in Scaling Transformers (aka Terraformer) | ML Research Paper Explained
#scalingtransformers #terraformer #sparsity
Transformers keep pushing the state of the art in language and other domains, mainly due to their ability to scale to ever more parameters. However, this scaling has made it prohibitively expensive to serve large numbers of inference requests, in terms of both compute and memory. Scaling Transformers are a new family of architectures that leverage sparsity in the Transformer blocks to massively speed up inference; combined with additional ideas from other architectures, they yield the Terraformer, which is fast and accurate while consuming very little memory.
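To give a feel for the core trick covered in the video: in the sparse feedforward layer, a small controller scores the hidden units for each token, each block of units keeps only its top-scoring member, and the large matrix multiplications touch only the surviving rows and columns. Here is a minimal NumPy sketch of the inference-time behaviour, under my own naming and shape assumptions (the paper's controller is low-rank and trained with a Gumbel-softmax, neither of which is shown); this is not the authors' Trax implementation:

# Minimal sketch of a sparse feedforward layer at inference time.
# All names and shapes are illustrative assumptions, not the paper's code.
import numpy as np

def sparse_ffn(x, W1, b1, W2, b2, C, block_size):
    # x: (d_model,) one token; W1: (d_model, d_ff); W2: (d_ff, d_model)
    # C: (d_model, d_ff) controller scoring the hidden units
    # (assumes d_ff is divisible by block_size)
    scores = x @ C                                 # (d_ff,) controller logits
    n_blocks = scores.shape[0] // block_size
    keep = scores.reshape(n_blocks, block_size).argmax(axis=1)
    idx = np.arange(n_blocks) * block_size + keep  # one surviving unit per block
    # Only the selected columns of W1 / rows of W2 are ever computed, so
    # per-token cost scales with d_ff / block_size rather than d_ff.
    h = np.maximum(x @ W1[:, idx] + b1[idx], 0.0)  # ReLU on the kept units
    return h @ W2[idx, :] + b2

With block_size = 1 every unit survives and this reduces to the standard dense feedforward layer.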
OUTLINE:
0:00 - Intro & Overview
4:10 - Recap: Transformer stack
6:55 - Sparse Feedforward layer
19:20 - Sparse QKV Layer
43:55 - Terraformer architecture
55:05 - Experimental Results & Conclusion
Paper: https://arxiv.org/abs/2111.12763
Code: https://github.com/google/trax/blob/m...
Abstract:
Large Transformer models yield impressive results on many tasks, but are expensive to train, or even fine-tune, and so slow at decoding that their use and study becomes out of reach. We address this problem by leveraging sparsity. We study sparse variants for all layers in the Transformer and propose Scaling Transformers, a family of next generation Transformer models that use sparse layers to scale efficiently and perform unbatched decoding much faster than the standard Transformer as we scale up the model size. Surprisingly, the sparse layers are enough to obtain the same perplexity as the standard Transformer with the same number of parameters. We also integrate with prior sparsity approaches to attention and enable fast inference on long sequences even with limited memory. This results in performance competitive to the state-of-the-art on long text summarization.
Authors: Sebastian Jaszczur, Aakanksha Chowdhery, Afroz Mohiuddin, Łukasz Kaiser, Wojciech Gajewski, Henryk Michalewski, Jonni Kanerva
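To make the Sparse QKV idea from the outline similarly concrete: the dense query/key/value projections are replaced by a cheaper "multiplicative" layer that factorizes the output into S modules of size M. A minimal sketch, again under my own naming assumptions (the paper's full Sparse QKV layer additionally mixes in a convolution over the sequence, omitted here):

# Hedged sketch of a multiplicative projection (names are assumptions).
import numpy as np

def multiplicative_dense(x, D, E):
    # x: (d_model,), D: (d_model, S), E: (d_model, M)
    # y[s, m] = sum_d x[d] * D[d, s] * E[d, m]
    # For d_model = S * M this needs d_model * (S + M) parameters
    # instead of the d_model * S * M of a full dense projection.
    return np.einsum('d,ds,dm->sm', x, D, E)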
Links:
TabNine Code Completion (Referral): http://bit.ly/tabnine-yannick
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://discord.gg/4H8xxDF
BitChute: https://www.bitchute.com/channel/yann...
LinkedIn: https://www.linkedin.com/in/ykilcher
BiliBili: https://space.bilibili.com/2017636191
If you want to support me, the best thing to do is to share out the content :)
If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar: https://www.subscribestar.com/yannick...
Patreon: https://www.patreon.com/yannickilcher
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n