

Poster

[Re] Understanding Self-Supervised Learning Dynamics without Contrastive Pairs

Tobias Höppe · Agnieszka Miszkurka · Dennis Bogatov Wilkman

Keywords: [ ReScience - MLRC 2021 ] [ Journal Track ]


Abstract:

Self-supervised learning without contrastive pairs has shown huge success in recent years. However, why these networks do not collapse despite the absence of contrastive pairs was not fully understood until very recently. In this work we re-implemented the architectures and pre-training schemes of SimSiam, BYOL, DirectPred and DirectCopy. We investigated the eigenspace alignment hypothesis of DirectPred by plotting the eigenvalues and eigenspace alignments for both SimSiam and BYOL, with and without symmetric regularization. We also combined the framework of DirectPred with SimCLRv2 to explore whether any further improvements could be made. We achieved results comparable to those reported for DirectPred with respect to accuracy and the behaviour of symmetry and eigenspace alignment.
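
For readers unfamiliar with the eigenspace alignment measure mentioned above, the following is a minimal sketch, not the authors' code: it compares the eigenvectors of an (assumed symmetric) predictor matrix W_p with those of the feature correlation matrix F = E[z z^T] via absolute cosine similarities. The function name `eigenspace_alignment` and the toy data are illustrative assumptions.

```python
import numpy as np

def eigenspace_alignment(W_p: np.ndarray, F: np.ndarray) -> np.ndarray:
    """For each eigenvector of F, return its maximum absolute cosine
    similarity with any eigenvector of W_p (1.0 = perfectly aligned).
    Hypothetical sketch; assumes both matrices are symmetric."""
    _, U_w = np.linalg.eigh(W_p)   # columns: eigenvectors of the predictor
    _, U_f = np.linalg.eigh(F)     # columns: eigenvectors of the correlation
    cos = np.abs(U_f.T @ U_w)      # pairwise |cosine| between eigenvectors
    return cos.max(axis=1)         # best match for each eigenvector of F

# Toy usage: a predictor built from F's own eigenvectors is perfectly aligned.
rng = np.random.default_rng(0)
Z = rng.normal(size=(512, 64))                 # stand-in embeddings
F = Z.T @ Z / len(Z)                           # empirical correlation matrix
vals, vecs = np.linalg.eigh(F)
W_p = vecs @ np.diag(np.sqrt(vals)) @ vecs.T   # DirectPred-style predictor
print(eigenspace_alignment(W_p, F))            # entries close to 1.0
```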
