

Poster in Workshop: Learning from Time Series for Health

Contrastive Pre-Training for Multimodal Medical Time Series

Aniruddh Raghu · Payal Chandak · Ridwan Alam · John Guttag · Collin Stultz


Abstract:

Clinical time series data are rich and provide significant information about a patient's physiological state. However, these time series can be complex to model, particularly when they consist of multimodal data measured at different resolutions. Most existing methods for learning representations of these data consider only tabular time series (e.g., lab measurements and vital signs) and do not naturally extend to modelling a full, multimodal time series. In this work, we propose a contrastive pre-training strategy to learn representations of multimodal time series. We consider a setting where the time series contains sequences of (1) high-frequency electrocardiograms and (2) structured data from labs and vitals. We outline a strategy to generate augmentations of these data for contrastive learning, building on recent work in representation learning for medical data. We evaluate our method on a real-world dataset, finding that it obtains improved or competitive performance compared to baselines on two downstream tasks.
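The abstract does not specify the exact contrastive objective, so the sketch below shows a standard symmetric InfoNCE-style loss that pairs per-patient embeddings from two encoders (one for ECG waveforms, one for tabular labs/vitals). The names `z_ecg` and `z_tab`, the temperature value, and the two-encoder setup are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z_ecg: torch.Tensor, z_tab: torch.Tensor,
                  temperature: float = 0.1) -> torch.Tensor:
    """Symmetric InfoNCE loss between two modality embeddings.

    z_ecg, z_tab: (batch, dim) embeddings of the ECG encoder and the
    tabular (labs/vitals) encoder for the same patients; row i of each
    tensor is treated as a positive pair, all other rows as negatives.
    This is a generic contrastive objective, not the paper's exact loss.
    """
    z_ecg = F.normalize(z_ecg, dim=1)          # unit-norm embeddings
    z_tab = F.normalize(z_tab, dim=1)
    logits = z_ecg @ z_tab.t() / temperature   # (batch, batch) cosine sims
    targets = torch.arange(z_ecg.size(0), device=z_ecg.device)
    # Cross-entropy in both directions: ECG->tabular and tabular->ECG.
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))
```

In practice, such a loss would be applied to augmented views of each modality (e.g., cropped or jittered ECG segments and perturbed tabular features), with the pre-trained encoders then fine-tuned on the downstream tasks.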
