Brain Networks Laboratory (Choe Lab)

[Facebook][Choe] Self-supervised learning at Facebook

Oct 12, 2021

Self-supervised learning at Facebook AI Research (FAIR).

We need to go beyond input-based self-supervised learning: the sensorimotor loop must be closed.

RL is, in a way, already doing this (e.g., via self-play), but how can we combine these two very promising approaches? One purely illustrative combination is sketched below.
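
The following is a toy sketch of one possible combination, not anything proposed in the quoted post or the linked article: share an encoder between a policy-gradient loss and a self-supervised forward-model loss that predicts the sensory consequence of the chosen action, i.e., a small step toward closing the sensorimotor loop. All module names, dimensions, and the fake transition data are assumptions.

```python
# Hypothetical sketch: RL policy loss + self-supervised forward-model loss
# on a shared encoder. Everything here is illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

OBS_DIM, ACT_DIM, HID = 16, 4, 64

class Agent(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(OBS_DIM, HID), nn.ReLU())  # shared representation
        self.policy = nn.Linear(HID, ACT_DIM)                             # action logits
        self.forward_model = nn.Linear(HID + ACT_DIM, OBS_DIM)            # predicts next observation

    def forward(self, obs):
        h = self.encoder(obs)
        return torch.distributions.Categorical(logits=self.policy(h)), h

agent = Agent()
opt = torch.optim.Adam(agent.parameters(), lr=1e-3)

# One illustrative update on fake transitions (obs, action, reward, next_obs).
obs = torch.randn(32, OBS_DIM)
next_obs = torch.randn(32, OBS_DIM)
reward = torch.randn(32)                      # stand-in for environment reward
dist, h = agent(obs)
action = dist.sample()

# Toy REINFORCE-style policy loss.
pg_loss = -(dist.log_prob(action) * reward).mean()

# Self-supervised sensorimotor loss: predict the next observation from the
# current representation and the chosen action.
action_onehot = F.one_hot(action, ACT_DIM).float()
pred_next = agent.forward_model(torch.cat([h, action_onehot], dim=-1))
ssl_loss = F.mse_loss(pred_next, next_obs)

loss = pg_loss + 0.1 * ssl_loss               # weighting is arbitrary here
opt.zero_grad()
loss.backward()
opt.step()
```

Whether a simple auxiliary loss like this is the right way to "close the loop" is exactly the open question raised above.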

Quote from Yann LeCun:

“Facebook Loves Self-Supervised Learning. Period.”

Nice article at Analytics India Magazine.

They got that right!

Though the history of SSL at FAIR started much earlier than 2018!

Since early 2014, there has been work on video prediction, Siamese nets (e.g. for learning embeddings and for face recognition), on GANs (DCGAN), and for text (e.g. with FastText). Some of that work had its roots in work that took place much earlier, by people who later joined FAIR. A good example is the celebrated 2010 Collobert-Weston work on “NLP from Scratch”, which involved training contrastively with clean text and corrupted text.

https://analyticsindiamag.com/facebook-loves-self-supervised-learning-period
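
As an aside, the contrastive "clean vs. corrupted text" training that LeCun credits to Collobert and Weston can be sketched in a few lines: score a window of words, corrupt it by replacing the center word with a random one, and train the scorer to rank the clean window above the corrupted one by a margin. The network, vocabulary size, and dimensions below are illustrative assumptions, not the original architecture.

```python
# Minimal sketch of pairwise ranking on clean vs. corrupted text windows.
import torch
import torch.nn as nn

VOCAB, EMB, WIN = 10_000, 50, 5

class WindowScorer(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.score = nn.Sequential(nn.Linear(WIN * EMB, 128), nn.Tanh(), nn.Linear(128, 1))

    def forward(self, windows):                 # windows: (batch, WIN) word ids
        e = self.emb(windows).flatten(1)        # (batch, WIN * EMB)
        return self.score(e).squeeze(-1)        # (batch,) scalar score per window

model = WindowScorer()
opt = torch.optim.SGD(model.parameters(), lr=0.01)

clean = torch.randint(0, VOCAB, (64, WIN))              # stand-in for real text windows
corrupt = clean.clone()
corrupt[:, WIN // 2] = torch.randint(0, VOCAB, (64,))   # replace the center word

# Hinge ranking loss: the clean window should outscore the corrupted one by margin 1.
loss = torch.clamp(1.0 - model(clean) + model(corrupt), min=0).mean()
opt.zero_grad()
loss.backward()
opt.step()
```

No labels are needed: the corruption itself provides the training signal, which is what makes the setup self-supervised.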

