

Spotlight in Workshop: NeurIPS 2023 Workshop on Tackling Climate Change with Machine Learning: Blending New and Existing Knowledge Systems

Lightweight, Pre-trained Transformers for Remote Sensing Timeseries

Gabriel Tseng · Ruben Cartuyvels · Ivan Zvonkov · Mirali Purohit · David Rolnick · Hannah Kerner

Sat 16 Dec 9:14 a.m. PST — 9:22 a.m. PST

Presentation: NeurIPS 2023 Workshop on Tackling Climate Change with Machine Learning: Blending New and Existing Knowledge Systems
Sat 16 Dec 6:15 a.m. PST — 3:30 p.m. PST

Abstract:

Machine learning models for parsing remote sensing data have a wide range of societally relevant applications, but the labels used to train these models can be difficult or impossible to acquire. This challenge has spurred research into self-supervised learning for remote sensing data. Current self-supervised learning approaches for remote sensing data draw significant inspiration from techniques applied to natural images. However, remote sensing data has important differences from natural images -- for example, the temporal dimension is critical for many tasks, and data is collected from many complementary sensors. We show that we can create significantly smaller, performant models by designing architectures and self-supervised training techniques specifically for remote sensing data. We introduce the Pretrained Remote Sensing Transformer (Presto), a transformer-based model pre-trained on remote sensing pixel-timeseries data. Presto excels at a wide variety of globally distributed remote sensing tasks and performs competitively with much larger models while requiring far less compute. Presto can be used for transfer learning or as a feature extractor for simple models, enabling efficient deployment at scale.
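To make the "feature extractor for simple models" usage pattern concrete, the sketch below shows the general workflow with a stand-in transformer encoder over pixel-timeseries. It is a minimal, hypothetical example and does not use the actual Presto code, weights, or API; the band count, timestep count, and embedding size are illustrative assumptions.

```python
# Hypothetical sketch: frozen pixel-timeseries encoder + simple downstream model.
# The encoder below is a stand-in, NOT the real Presto architecture or weights.
import numpy as np
import torch
import torch.nn as nn
from sklearn.linear_model import LogisticRegression

NUM_BANDS = 18      # assumed number of input channels per timestep
NUM_TIMESTEPS = 12  # assumed, e.g. one observation per month
EMBED_DIM = 128     # assumed embedding size

class PixelTimeseriesEncoder(nn.Module):
    """Stand-in transformer encoder for per-pixel remote sensing timeseries."""
    def __init__(self, num_bands=NUM_BANDS, embed_dim=EMBED_DIM, depth=2, heads=4):
        super().__init__()
        self.input_proj = nn.Linear(num_bands, embed_dim)
        layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, x):
        # x: (batch, timesteps, bands) -> mean-pooled embedding (batch, embed_dim)
        h = self.encoder(self.input_proj(x))
        return h.mean(dim=1)

# In practice the pre-trained weights would be loaded here; this encoder is untrained.
encoder = PixelTimeseriesEncoder().eval()

# Synthetic placeholder data standing in for labelled pixel-timeseries.
X = torch.randn(256, NUM_TIMESTEPS, NUM_BANDS)
y = np.random.randint(0, 2, size=256)  # e.g. binary crop / non-crop labels

# Extract frozen embeddings, then fit a simple model on top of them.
with torch.no_grad():
    features = encoder(X).numpy()

clf = LogisticRegression(max_iter=1000).fit(features, y)
print("train accuracy:", clf.score(features, y))
```

Because the expensive encoder is frozen and the downstream model is cheap to fit, this pattern keeps compute low at deployment time, which is the efficiency argument the abstract makes.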
