Brain Networks Laboratory (Choe Lab)

[Uber][Stanley] Differentiable Plasticity: A New Method for Learning to Learn

Apr 11, 2018

Thomas Miconi, Jeff Clune, and Kenneth O. Stanley

From the article:

Neural networks, which underlie many of Uber’s machine learning systems, have proven highly successful in solving complex problems, including image recognition, language understanding, and game-playing. However, these networks are usually trained to a stopping point through gradient descent, which incrementally adjusts the connections of the network based on its performance over many trials. Once the training is complete, the network is fixed and the connections can no longer change; as a result, barring any later re-training (again requiring many examples), the network in effect stops learning at the moment training ends.

By contrast, biological brains exhibit plasticity—that is, the ability for connections between neurons to change continually and automatically throughout life, allowing animals to learn quickly and efficiently from ongoing experience. The levels of plasticity of different areas and connections in the brain are the result of millions of years of fine-tuning by evolution to allow efficient learning during the animal’s lifetime. The resultant ability to learn continually over life lets animals adapt to changing or unpredictable environments with very little additional data. We can quickly memorize patterns that we have never seen before or learn new behaviors from just a few trials in entirely novel situations. …

https://eng.uber.com/differentiable-plasticity/
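The core idea in the full article (and the accompanying paper by Miconi, Clune, and Stanley) is to give each connection a fixed weight plus a Hebbian trace scaled by a learned plasticity coefficient, so the trace keeps changing during the network's "lifetime" while gradient descent tunes the coefficients. A minimal NumPy sketch of that rule, with toy sizes and untrained random parameters standing in for gradient-trained ones:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # toy number of units

# Per-connection parameters; in the real method these are trained by
# gradient descent (here they are just random placeholders):
w = rng.standard_normal((n, n)) * 0.1      # fixed baseline weights
alpha = rng.standard_normal((n, n)) * 0.1  # learned plasticity coefficients
eta = 0.05                                 # plasticity learning rate

hebb = np.zeros((n, n))  # Hebbian trace, updated throughout the "lifetime"
x = rng.standard_normal(n)

for _ in range(10):
    # Effective weight = fixed part + plasticity coefficient * running trace
    y = np.tanh((w + alpha * hebb).T @ x)
    # Hebbian update: trace decays toward the outer product of activity
    hebb = (1 - eta) * hebb + eta * np.outer(x, y)
    x = y
```

Because `hebb` is built from differentiable operations, the loss gradient flows back through the whole lifetime to `w`, `alpha`, and `eta`, which is what lets plasticity itself be optimized by backpropagation.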
