

Poster in Workshop: 4th Workshop on Self-Supervised Learning: Theory and Practice

WERank: Rank Degradation Prevention for Self-Supervised Learning via Weight Regularization

Ali Saheb Pasand · Reza Moravej · Mahdi Biparva · Ali Ghodsi


Abstract:

A common phenomenon in self-supervised learning is dimensional collapse (also known as rank degeneration), where the learned embeddings are mapped to a low-dimensional subspace of the embedding space. Despite employing mechanisms to prevent dimensional collapse, previous self-supervised approaches have not completely alleviated the problem. We propose WERank, a new regularizer on the weight parameters of the neural network encoder that prevents rank degeneration. Our regularization term can be applied on top of any existing self-supervised method without significant computational cost. We provide empirical and mathematical evidence of the effectiveness of WERank in avoiding dimensional collapse.
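Since the abstract describes a regularizer applied directly to the encoder's weight parameters on top of an existing self-supervised loss, the following minimal sketch illustrates that general idea. It assumes an orthogonality-style penalty on each linear layer's weight Gram matrix; the penalty form, coefficient, toy encoder, and stand-in SSL loss are illustrative assumptions, not the exact WERank formulation.

```python
# Illustrative sketch only: a generic weight-space regularizer added to an
# existing self-supervised loss. The exact WERank term may differ.
import torch
import torch.nn as nn


def weight_rank_regularizer(model: nn.Module, coeff: float = 1e-3) -> torch.Tensor:
    """Penalize deviation of each linear layer's weight Gram matrix from identity,
    discouraging the weights (and hence the embeddings) from collapsing onto a
    low-dimensional subspace."""
    penalty = 0.0
    for module in model.modules():
        if isinstance(module, nn.Linear):
            w = module.weight                       # (out_features, in_features)
            gram = w @ w.t()                        # (out_features, out_features)
            eye = torch.eye(gram.shape[0], device=w.device, dtype=w.dtype)
            penalty = penalty + ((gram - eye) ** 2).sum()
    return coeff * penalty


# Usage: add the term to whatever SSL objective is already being optimized.
encoder = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 64))
x1, x2 = torch.randn(32, 128), torch.randn(32, 128)    # two augmented views (dummy data)
z1, z2 = encoder(x1), encoder(x2)
ssl_loss = ((z1 - z2) ** 2).mean()                      # stand-in for the base SSL loss
total_loss = ssl_loss + weight_rank_regularizer(encoder)
total_loss.backward()
```

Because the penalty depends only on the weight matrices, it adds a single extra term per layer to the loss and does not require extra forward passes, which is consistent with the abstract's claim of negligible computational overhead.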
