
Feature Restricted Group Dropout for Robust Electronic Health Record Predictions
Bret Nestor · Anna Goldenberg · Marzyeh Ghassemi
Event URL: https://openreview.net/forum?id=nAWAm3WMxn

Recurrent neural networks are commonly applied to electronic health records (EHRs) to capture complex relationships and model clinically relevant outcomes. However, the covariates in EHRs frequently shift in distribution. This work extends restricted feature interactions in recurrent neural networks to address both foreseeable and unexpected covariate shifts. We extend prior work by 1) introducing a deterministic feature rotation so that hyperparameter tuning can search through all combinations of features, 2) introducing a sub-network-specific dropout that ablates the influence of entire features at the output of the hidden network, and 3) extending the feature restrictions to the GRU-D network, which has been shown to be a stronger baseline for covariate-shift recovery. We show that feature-restricted GRU-Ds may be more robust to certain perturbations, and that no manual intervention was needed to confer this robustness. Despite this, the LSTM remained the best model in nearly 50% of cases.
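The sub-network-specific ("group") dropout described above can be sketched as follows: each feature-restricted sub-network produces its own hidden output, and during training entire sub-network outputs are zeroed at once, rather than individual units. This is a minimal numpy sketch under our own assumptions about shapes and naming (the authors' implementation sits inside an LSTM/GRU-D and is not reproduced here); survivors are rescaled by 1/(1-p) in the standard inverted-dropout style so the expected output is unchanged.

```python
import numpy as np

def group_dropout(subnet_outputs, p=0.5, rng=None, training=True):
    """Drop entire feature sub-network outputs at once (illustrative sketch).

    subnet_outputs: array of shape (n_groups, hidden_dim), one row per
    feature-restricted sub-network. During training, each group is zeroed
    independently with probability p, and surviving groups are rescaled by
    1/(1-p) so the expected value of the output is preserved.
    """
    if not training or p == 0.0:
        return subnet_outputs
    rng = np.random.default_rng() if rng is None else rng
    # One Bernoulli draw per group, not per unit: a dropped group removes
    # that feature's entire influence on the downstream prediction.
    keep = rng.random(subnet_outputs.shape[0]) >= p
    mask = keep[:, None].astype(subnet_outputs.dtype) / (1.0 - p)
    return subnet_outputs * mask

# Example: 8 feature groups, each with a 4-dimensional hidden output.
h = np.ones((8, 4))
h_train = group_dropout(h, p=0.5, rng=np.random.default_rng(0))
h_eval = group_dropout(h, p=0.5, training=False)  # identity at test time
```

At test time the function is the identity, mirroring ordinary inverted dropout; the intended effect is that the network learns not to rely on any single feature's sub-network, which is what confers robustness when a feature's distribution shifts or the feature disappears entirely.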

Author Information

Bret Nestor (University of Toronto)
Anna Goldenberg (University of Toronto)
Marzyeh Ghassemi (MIT)
