Invited Talk: Deep learning without weight transport
Timothy Lillicrap

Recent advances in machine learning have been made possible by employing the backpropagation-of-error algorithm. Backprop enables the delivery of detailed error feedback across multiple layers of representation to adjust synaptic weights, allowing us to effectively train even very large networks. Whether or not the brain employs similar deep learning algorithms remains contentious; how it might do so remains a mystery. In particular, backprop uses the weights in the forward pass of the network to precisely compute error feedback in the backward pass. This way of computing errors across multiple layers is fundamentally at odds with what we know about the local computations of brains. We will describe new proposals for biologically motivated learning algorithms that are as effective as backpropagation without requiring weight transport.
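
To make the weight transport problem concrete, here is a minimal sketch of one biologically motivated alternative from this line of work: feedback alignment (Lillicrap et al., 2016), which replaces the transposed forward weights in the backward pass with a fixed random feedback matrix. This is an illustration under stated assumptions, not the specific algorithm of the talk; the network sizes, `sigmoid` nonlinearity, and names `W1`, `W2`, `B`, `lr` are all hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer network: x -> h -> y_hat
n_in, n_hid, n_out = 4, 16, 2
W1 = rng.normal(0.0, 0.5, (n_hid, n_in))   # forward weights, layer 1
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))  # forward weights, layer 2
B = rng.normal(0.0, 0.5, (n_hid, n_out))   # fixed random feedback weights (stand-in for W2.T)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = rng.normal(size=(n_in, 1))             # a single toy input
y = np.array([[1.0], [0.0]])               # its toy target
lr = 0.5

for _ in range(200):
    # Forward pass
    h = sigmoid(W1 @ x)
    y_hat = sigmoid(W2 @ h)

    # Output error (cross-entropy loss gradient w.r.t. the pre-sigmoid output)
    e = y_hat - y

    # Backprop would compute the hidden error as (W2.T @ e) * h * (1 - h);
    # feedback alignment routes the error through the fixed random matrix B instead,
    # so the forward weights never need to be "transported" to the backward pass.
    delta_h = (B @ e) * h * (1.0 - h)

    # Local weight updates from pre- and post-synaptic activity plus the error signal
    W2 -= lr * e @ h.T
    W1 -= lr * delta_h @ x.T
```

The empirical finding in that work is that the forward weights tend to align with the fixed feedback matrix over training, which is why learning can still succeed without the precise symmetric feedback that backprop assumes.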

Author Information

Timothy Lillicrap (DeepMind & UCL)
