Brain Networks Laboratory (Choe Lab)

Pretrained transformers as universal computation engines

Mar 11, 2021

The paper shows that a transformer pretrained on language and then frozen (its self-attention and feedforward layers receive no further fine-tuning on the target domain) can transfer to other sequence tasks, including numerical computation, vision, and protein folding; a minimal code sketch of the idea follows the paper link below. The work is a collaboration among UC Berkeley, Google, and Facebook.

Pretrained Transformers as Universal Computation Engines

Lu et al.: https://arxiv.org/abs/2103.05247
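
To make the idea concrete, here is a minimal sketch of a frozen pretrained transformer in PyTorch with the Hugging Face transformers library. This is not the authors' released code: the class name FrozenPretrainedTransformer, the parameters patch_dim and num_classes, and the mean-pooling readout are illustrative assumptions. The sketch freezes the GPT-2 weights except the layer norms and positional embeddings, and trains only new linear input/output layers for the target task.

```python
# Minimal sketch of a frozen pretrained transformer (illustrative, not the
# paper's released code). Assumes: torch, transformers.
import torch
import torch.nn as nn
from transformers import GPT2Model

class FrozenPretrainedTransformer(nn.Module):  # hypothetical name
    def __init__(self, patch_dim: int, num_classes: int):
        super().__init__()
        self.gpt2 = GPT2Model.from_pretrained("gpt2")
        # Freeze the pretrained weights, keeping only the layer norms ("ln")
        # and positional embeddings ("wpe") trainable.
        for name, param in self.gpt2.named_parameters():
            param.requires_grad = "ln" in name or "wpe" in name
        embed_dim = self.gpt2.config.n_embd  # 768 for the base gpt2 model
        # New, trainable input and output layers for the target modality.
        self.input_proj = nn.Linear(patch_dim, embed_dim)
        self.output_proj = nn.Linear(embed_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, patch_dim), e.g. flattened image patches.
        h = self.input_proj(x)
        h = self.gpt2(inputs_embeds=h).last_hidden_state
        # Pool over the sequence and classify.
        return self.output_proj(h.mean(dim=1))

# Example: a 32x32 grayscale image split into 4x4 patches gives a sequence
# of 64 tokens, each of dimension 16.
model = FrozenPretrainedTransformer(patch_dim=16, num_classes=10)
logits = model(torch.randn(8, 64, 16))  # shape: (8, 10)
```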

