Working hard to know your neighbor's margins: Local descriptor learning loss
Anastasiia Mishchuk · Dmytro Mishkin · Filip Radenovic · Jiri Matas

Mon Dec 04 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #96

We introduce a loss for metric learning inspired by Lowe's matching criterion for SIFT. We show that the proposed loss, which maximizes the distance between the closest positive and closest negative example in the batch, outperforms complex regularization methods; it works well for both shallow and deep convolutional network architectures. Applying the novel loss to the L2Net CNN architecture results in a compact descriptor named HardNet. It has the same dimensionality as SIFT (128) and shows state-of-the-art performance in wide baseline stereo, patch verification, and instance retrieval benchmarks.
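The core idea can be sketched in a few lines: for a batch of matching descriptor pairs, each anchor's hardest (closest) non-matching descriptor in the batch serves as the negative in a triplet margin loss. The NumPy sketch below is a simplified illustration of this hardest-in-batch mining, not the authors' implementation; the function name and the epsilon/masking constants are choices made here for clarity.

```python
import numpy as np

def hardest_in_batch_loss(anchors, positives, margin=1.0):
    """Triplet margin loss with hardest-in-batch negative mining (sketch).

    anchors, positives: (n, d) arrays of L2-normalized descriptors where
    row i of each array forms a matching pair; all other rows act as
    candidate negatives.
    """
    # Pairwise Euclidean distances between all anchors and all positives.
    diff = anchors[:, None, :] - positives[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1) + 1e-8)  # eps for stability
    n = dist.shape[0]
    pos = np.diag(dist)  # distance of each matching pair d(a_i, p_i)
    # Mask the diagonal so a matching pair is never chosen as a negative.
    masked = dist + np.eye(n) * 1e5
    min_neg_row = masked.min(axis=1)  # closest non-matching p_j to a_i
    min_neg_col = masked.min(axis=0)  # closest non-matching a_j to p_i
    min_neg = np.minimum(min_neg_row, min_neg_col)
    # Hinge on the margin between positive and hardest-negative distance.
    return np.maximum(0.0, margin + pos - min_neg).mean()
```

With well-separated pairs (e.g. orthonormal descriptors matched to themselves) the hinge is inactive and the loss is near zero; when an anchor sits closer to a wrong descriptor than to its match, the loss grows.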

Author Information

Anastasiia Mishchuk (Szkocka Research Group, Ukraine)
Dmytro Mishkin (Czech Technical University in Prague)

PhD candidate at Czech Technical University in Prague. Co-founder of Clear Research. Working in computer vision since 2012. Commercial research: developed core technology for madora.co, a personalized, visual-based shopping app, at Clear Research. Academic research: developed MODS, a state-of-the-art algorithm for wide baseline stereo matching, and LSUV, a state-of-the-art method for convolutional neural network initialization. Co-founder of Szkocka Research Group, a Ukrainian open research community for computer science.

Filip Radenovic (Visual Recognition Group, CTU in Prague)
Jiri Matas (Czech Technical University)