

Tutorial

Self-Supervised Learning: Self-Prediction and Contrastive Learning

Lilian Weng · Jong Wook Kim

Moderators: Erin Grant · Alfredo Canziani

Virtual

Abstract:

Self-supervised learning is an effective way to extract training signals from massive amounts of unlabelled data and to learn good representations that facilitate downstream tasks where collecting task-specific labels is expensive. This tutorial focuses on two major approaches to self-supervised learning: self-prediction and contrastive learning. Self-prediction refers to self-supervised training tasks in which the model learns to predict a portion of the available data from the rest. Contrastive learning learns a representation space in which similar data samples stay close to each other while dissimilar ones are far apart, by constructing similar and dissimilar pairs from the dataset. The tutorial covers methods on both topics across various applications, including vision, language, video, multimodal data, and reinforcement learning.
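The contrastive idea described above can be made concrete with a small numerical sketch. The snippet below is a hypothetical, minimal NumPy implementation of an InfoNCE-style contrastive loss (one common instantiation of contrastive learning, not necessarily the exact formulation used in the tutorial): each anchor embedding is paired with a positive view of the same sample, and the other samples in the batch serve as negatives.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE-style contrastive loss.

    Each row i of `anchors` has its positive at row i of `positives`;
    all other rows in the batch act as negatives.
    """
    # L2-normalize so dot products are cosine similarities.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                 # (N, N) similarity matrix
    # Cross-entropy with the diagonal (matching pairs) as targets.
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
# Positives built as slightly perturbed views of the same samples.
loss_pos = info_nce_loss(z, z + 0.01 * rng.normal(size=z.shape))
# Unrelated random pairs should incur a higher (worse) loss.
loss_rand = info_nce_loss(z, rng.normal(size=(8, 16)))
print(loss_pos, loss_rand)
```

The loss is small when matching views are much more similar to each other than to the rest of the batch, which is exactly the geometry the abstract describes: similar pairs pulled close, dissimilar ones pushed apart.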
