Brain Networks Laboratory (Choe Lab)

Generating Neural Networks with Neural Networks

Jan 12, 2018


Abstract: Hypernetworks are neural networks that transform a random input vector into weights for a specified target neural network. We formulate the hypernetwork training objective as a compromise between accuracy and diversity, where the diversity takes into account trivial symmetry transformations of the target network. We show that this formulation arises naturally as a relaxation of an optimistic probability distribution objective for the generated networks, and we explain how it is related to variational inference. We use multilayer perceptrons to form the mapping from the low-dimensional input random vector to the high-dimensional weight space, and demonstrate how to reduce the number of parameters in this mapping by weight sharing. We perform experiments on a four-layer convolutional target network which classifies MNIST images, and show that the generated weights are diverse and have interesting distributions.

https://arxiv.org/abs/1801.01952v1
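To make the core idea concrete, here is a minimal sketch of a hypernetwork in numpy: a small MLP maps a low-dimensional random vector z to a flat weight vector, which is then reshaped into the weight matrices of a target network. The target shapes, dimensions, and initialization below are hypothetical placeholders, not the paper's architecture (the paper uses a four-layer convolutional MNIST classifier and weight sharing to shrink the mapping).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target network: a tiny two-layer MLP for 28x28 inputs.
target_shapes = [(784, 32), (32, 10)]
n_target = sum(int(np.prod(s)) for s in target_shapes)  # total weight count

# Hypernetwork: an MLP from a low-dimensional z to the flat weight vector.
z_dim, hidden = 8, 64
W1 = rng.normal(0.0, 0.1, (z_dim, hidden))
W2 = rng.normal(0.0, 0.1, (hidden, n_target))

def hypernet(z):
    """Map a random vector z to a list of target-network weight matrices."""
    h = np.tanh(z @ W1)   # hidden representation
    flat = h @ W2         # flat weight vector for the target network
    weights, i = [], 0
    for shape in target_shapes:
        n = int(np.prod(shape))
        weights.append(flat[i:i + n].reshape(shape))
        i += n
    return weights

# Each sampled z yields a different set of target-network weights,
# which is what makes the diversity term in the objective meaningful.
z = rng.normal(size=z_dim)
weights = hypernet(z)
print([w.shape for w in weights])
```

In training, the accuracy term would score each generated weight set on the target task, while the diversity term would push weights produced from different z apart (modulo the target network's symmetries); both are omitted here.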

