Wavelet-Based Masked Multiscale Reconstruction for PPG Foundation Models
Megha Thukral · Cyrus Tanade · Simon Lee · Juhyeon Lee · Sharanya Desai
Abstract
We introduce Masked Multiscale Reconstruction (MMR), a self-supervised pretraining framework for photoplethysmography (PPG) signals that leverages the discrete wavelet transform. MMR is pretrained on $\sim$18M unlabeled 10-second PPG segments collected from $\sim$41K smartwatch users, largely in naturalistic field settings. The pretraining task randomly masks subsets of wavelet coefficients derived from a multi-resolution decomposition of the raw PPG signal and trains the encoder to reconstruct them. This enables the model to capture patterns across scales, from fine-grained waveform morphology to long-term temporal dynamics, that are crucial for diverse downstream tasks. On 10 of 13 health-related tasks, MMR trained on large-scale wearable PPG data outperforms or matches state-of-the-art open-source PPG foundation models and other self-supervised baselines. An ablation study of wavelet design further underscores the value of wavelet-based representations, paving the way toward robust and generalizable PPG foundation models.
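The masking objective described above can be illustrated with a minimal sketch. The snippet below uses a plain Haar wavelet implemented in NumPy for self-containment; the paper's actual wavelet family, decomposition depth, masking ratio, and encoder architecture are not specified here, and all function names are illustrative assumptions.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform (illustrative stand-in
    for whatever wavelet family MMR actually uses)."""
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2)   # low-pass (coarse-scale) coefficients
    detail = (even - odd) / np.sqrt(2)   # high-pass (fine-scale) coefficients
    return approx, detail

def multiscale_decompose(x, levels):
    """Multi-resolution decomposition: repeatedly split the approximation band."""
    coeffs = []
    approx = x
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        coeffs.append(detail)
    coeffs.append(approx)
    return coeffs  # [detail_1 (finest), ..., detail_L (coarsest), approx_L]

def mask_coefficients(coeffs, mask_ratio, rng):
    """Randomly zero out a subset of coefficients at every scale; the masked
    values become the encoder's reconstruction targets."""
    masked, targets, masks = [], [], []
    for c in coeffs:
        m = rng.random(c.shape) < mask_ratio
        masked.append(np.where(m, 0.0, c))
        targets.append(c[m])
        masks.append(m)
    return masked, targets, masks

rng = np.random.default_rng(0)
ppg = rng.standard_normal(256)  # stand-in for a 10-second PPG segment
coeffs = multiscale_decompose(ppg, levels=3)
masked, targets, masks = mask_coefficients(coeffs, mask_ratio=0.3, rng=rng)
```

In a full pretraining loop, `masked` would be fed to the encoder and a reconstruction loss (e.g. MSE) computed against `targets` at the masked positions, so the model must infer both fine waveform detail and coarse temporal structure.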