Poster
Residual Relaxation for Multi-view Representation Learning
Yifei Wang · Zhengyang Geng · Feng Jiang · Chuming Li · Yisen Wang · Jiansheng Yang · Zhouchen Lin

Tue Dec 07 08:30 AM -- 10:00 AM (PST)

Multi-view methods learn representations by aligning multiple views of the same image, and their performance largely depends on the choice of data augmentation. In this paper, we observe that some otherwise useful augmentations, such as image rotation, are harmful to multi-view methods because they induce a semantic shift too large to be aligned well. This observation motivates us to relax the exact alignment objective to better accommodate stronger augmentations. Taking image rotation as a case study, we develop a generic approach, Pretext-aware Residual Relaxation (Prelax), which relaxes exact alignment by allowing an adaptive residual vector between different views and encoding the semantic shift through pretext-aware learning. Extensive experiments on different backbones show that our method not only improves multi-view methods with existing augmentations, but also benefits from stronger image augmentations like rotation.
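To illustrate the core idea, the following is a minimal NumPy sketch, not the paper's implementation: an exact-alignment loss penalizes any gap between two view embeddings, while a residual-relaxed loss tolerates a gap up to an adaptive residual vector (here supplied directly; in Prelax it would be learned from the pretext task). All names and values are hypothetical.

```python
import numpy as np

def exact_alignment_loss(z1, z2):
    # Standard multi-view objective: pull the two view embeddings together.
    return float(np.sum((z1 - z2) ** 2))

def residual_relaxed_loss(z1, z2, residual):
    # Relaxed objective (illustrative): allow a residual vector between the
    # views, so a strong augmentation such as rotation only needs to be
    # aligned up to its semantic shift rather than exactly.
    return float(np.sum((z1 - (z2 + residual)) ** 2))

# Toy example: z2 is z1 shifted by a known semantic offset.
rng = np.random.default_rng(0)
z1 = rng.standard_normal(8)
shift = np.full(8, 0.5)        # stands in for a rotation-induced shift
z2 = z1 - shift

# Exact alignment is penalized by the full shift...
print(round(exact_alignment_loss(z1, z2), 4))           # 2.0
# ...while a residual encoding the shift removes the penalty.
print(round(residual_relaxed_loss(z1, z2, shift), 4))   # 0.0
```

In the paper the residual is not a fixed vector but is predicted adaptively and tied to pretext-aware learning of the augmentation (e.g. which rotation was applied); this sketch only shows why relaxing exact alignment helps when views differ semantically.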

Author Information

Yifei Wang (Peking University)
Zhengyang Geng (Peking University)
Feng Jiang (Peking University)
Chuming Li (University of Electronic Science and Technology of China)
Yisen Wang (Peking University)
Jiansheng Yang
Zhouchen Lin (Peking University)
