Neural Similarity Learning
Weiyang Liu · Zhen Liu · James Rehg · Le Song

Thu Dec 12 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #145

Inner product-based convolution has been the cornerstone of convolutional neural networks (CNNs), enabling end-to-end learning of visual representations. By generalizing the inner product with a bilinear matrix, we propose the neural similarity, a learnable parametric similarity measure for CNNs. Neural similarity naturally generalizes convolution and enhances flexibility. We further consider neural similarity learning (NSL), which learns the neural similarity adaptively from training data, and propose two ways of doing so: static NSL and dynamic NSL. Interestingly, dynamic neural similarity turns the CNN into a dynamic inference network. By regularizing the bilinear matrix, NSL can be viewed as simultaneously learning the kernel shape and the similarity measure. We further justify the effectiveness of NSL from a theoretical viewpoint. Most importantly, NSL shows promising performance in visual recognition and few-shot learning, validating its superiority over the inner product-based convolution counterparts.
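To make the core idea concrete, below is a minimal PyTorch sketch of a convolution layer that scores each input patch x against each filter w with a learnable bilinear form w^T M x rather than the plain inner product w^T x, in the spirit of the static NSL variant described in the abstract. The class name, identity initialization of M, and the unfold-based implementation are illustrative assumptions, not the authors' reference code; setting M to the identity recovers ordinary convolution.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StaticNeuralSimilarityConv2d(nn.Module):
    """Convolution whose patch/filter similarity is a learnable bilinear form.

    Standard convolution scores a patch x against a filter w with the inner
    product w^T x. Here the score is w^T M x, where M is a learnable
    (C_in*k*k) x (C_in*k*k) matrix shared across all filters ("static" NSL).
    """

    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        d = in_channels * kernel_size * kernel_size
        self.kernel_size = kernel_size
        self.weight = nn.Parameter(torch.randn(out_channels, d) * d ** -0.5)
        # Start M at the identity so training begins from plain convolution.
        self.M = nn.Parameter(torch.eye(d))

    def forward(self, x):
        n, _, h, w = x.shape
        k = self.kernel_size
        # Extract all k x k patches: (N, C_in*k*k, L), L spatial positions.
        patches = F.unfold(x, k)
        # Bilinear similarity for every filter/patch pair: (W M) patches.
        scores = (self.weight @ self.M) @ patches  # (N, C_out, L)
        return scores.view(n, -1, h - k + 1, w - k + 1)

# Usage: a 3->16 channel layer with 3x3 kernels on a 32x32 input.
layer = StaticNeuralSimilarityConv2d(3, 16, 3)
out = layer(torch.randn(2, 3, 32, 32))  # -> shape (2, 16, 30, 30)
```

A dynamic NSL variant would, by analogy, predict M from the input with a small side network instead of holding it as a fixed parameter, which is what makes the resulting CNN a dynamic inference network.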

Author Information

Weiyang Liu (Georgia Institute of Technology)
Zhen Liu (MILA, University of Montreal)
James Rehg (Georgia Tech)
Le Song (Georgia Institute of Technology)