

Poster in Workshop: Table Representation Learning Workshop

Self-supervised Representation Learning from Random Data Projectors

Yi Sui · Tongzi Wu · Jesse Cresswell · Ga Wu · George Stein · Xiao Shi Huang · Xiaochen Zhang · Maksims Volkovs

Keywords: [ Tabular representation learning ] [ random data projections ] [ domain-agnostic representation learning ] [ Representation Learning ] [ Self-supervised learning ]

Presentation: Table Representation Learning Workshop
Fri 15 Dec 6:30 a.m. PST — 3:30 p.m. PST

Abstract:

Self-supervised representation learning (SSRL) has advanced considerably by exploiting the transformation invariance assumption under artificially designed data augmentations. While augmentation-based SSRL algorithms push the boundaries of performance in computer vision and natural language processing, they are often not directly applicable to other data modalities such as tabular and time-series data. This paper presents an SSRL approach that can be applied to these data modalities because it does not rely on augmentations or masking. Specifically, we show that high-quality data representations can be learned by reconstructing random data projections. We evaluate the proposed approach on real-world applications with tabular and time-series data. We show that it outperforms multiple state-of-the-art SSRL baselines and is competitive with methods built on domain-specific knowledge. Due to its wide applicability and strong empirical results, we argue that learning from randomness is a fruitful research direction worthy of attention and further study.
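To make the core idea concrete, below is a minimal sketch of one possible instantiation: a trainable encoder is optimized so that its representation can reconstruct the outputs of several fixed, randomly initialized data projectors. The class name, network sizes, number of projectors, and the MSE objective are all illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch (names, dimensions, and loss are assumptions):
# learn a representation by reconstructing fixed random data projections.
import torch
import torch.nn as nn

class RandomProjectorSSRL(nn.Module):
    def __init__(self, input_dim=32, repr_dim=64, proj_dim=16, num_projectors=8):
        super().__init__()
        # Trainable encoder producing the representation of interest.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(), nn.Linear(128, repr_dim)
        )
        # Fixed random data projectors: weights frozen at initialization.
        self.projectors = nn.ModuleList(
            [nn.Linear(input_dim, proj_dim) for _ in range(num_projectors)]
        )
        for p in self.projectors.parameters():
            p.requires_grad = False
        # One trainable predictor per projector: representation -> projection.
        self.predictors = nn.ModuleList(
            [nn.Linear(repr_dim, proj_dim) for _ in range(num_projectors)]
        )

    def forward(self, x):
        z = self.encoder(x)
        # Reconstruction objective: predict each random projection of x from z.
        loss = sum(
            nn.functional.mse_loss(pred(z), proj(x))
            for proj, pred in zip(self.projectors, self.predictors)
        )
        return z, loss / len(self.projectors)

# Usage: one optimization step on a batch of (synthetic) tabular rows.
model = RandomProjectorSSRL()
opt = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-3
)
x = torch.randn(256, 32)
_, loss = model(x)
loss.backward()
opt.step()
```

Because the projectors require no domain-specific augmentations or masking, this style of objective can in principle be applied to tabular and time-series inputs alike.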
