This paper addresses the problem of time series forecasting for non-stationary signals and multi-step-ahead prediction. To handle this challenging task, we introduce DILATE (DIstortion Loss including shApe and TimE), a new objective function for training deep neural networks. DILATE aims at accurately predicting sudden changes, and explicitly incorporates two terms supporting precise shape and temporal change detection. We introduce a differentiable loss function suitable for training deep neural nets, and provide a custom backpropagation implementation for speeding up optimization. We also introduce a variant of DILATE, which provides a smooth generalization of temporally-constrained Dynamic Time Warping (DTW). Experiments carried out on various non-stationary datasets reveal the very good behaviour of DILATE compared to models trained with the standard Mean Squared Error (MSE) loss function, and also to DTW and its variants. DILATE is also agnostic to the choice of model: we highlight its benefit for training fully connected networks as well as specialized recurrent architectures, showing its capacity to improve over state-of-the-art trajectory forecasting approaches.
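The abstract describes DILATE as a weighted combination of a shape term and a temporal term, built on a differentiable relaxation of DTW. The following sketch illustrates that idea in NumPy, assuming (this is not the authors' released code) that the shape term is the soft-DTW loss of Cuturi & Blondel (2017) and that the temporal term penalizes the expected soft alignment for deviating from the diagonal; the functions `soft_dtw`, `dilate_loss` and the weights `alpha`, `gamma` are illustrative names, and the exact normalizations in the paper may differ.

```python
import numpy as np

def soft_dtw(x, y, gamma=0.1):
    """Soft-DTW between 1-D series x and y: returns the smoothed alignment
    cost and the expected (soft) alignment matrix via the backward pass."""
    n, m = len(x), len(y)
    D = (x[:, None] - y[None, :]) ** 2              # pairwise squared-error costs
    # Forward pass: R[i, j] = soft-DTW cost of aligning x[:i] with y[:j].
    R = np.full((n + 2, m + 2), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            r = np.array([R[i - 1, j - 1], R[i - 1, j], R[i, j - 1]])
            rmin = r.min()
            # soft-min replaces DTW's hard min, making the loss differentiable
            softmin = rmin - gamma * np.log(np.exp(-(r - rmin) / gamma).sum())
            R[i, j] = D[i - 1, j - 1] + softmin
    loss = R[n, m]
    # Backward pass: E is the gradient of the loss w.r.t. the cost matrix,
    # interpretable as an expected alignment between the two series.
    D_pad = np.zeros((n + 2, m + 2))
    D_pad[1:n + 1, 1:m + 1] = D
    R[n + 1, :] = -np.inf
    R[:, m + 1] = -np.inf
    R[n + 1, m + 1] = R[n, m]
    E = np.zeros((n + 2, m + 2))
    E[n + 1, m + 1] = 1.0
    for i in range(n, 0, -1):
        for j in range(m, 0, -1):
            a = np.exp((R[i + 1, j] - R[i, j] - D_pad[i + 1, j]) / gamma)
            b = np.exp((R[i, j + 1] - R[i, j] - D_pad[i, j + 1]) / gamma)
            c = np.exp((R[i + 1, j + 1] - R[i, j] - D_pad[i + 1, j + 1]) / gamma)
            E[i, j] = a * E[i + 1, j] + b * E[i, j + 1] + c * E[i + 1, j + 1]
    return loss, E[1:n + 1, 1:m + 1]

def dilate_loss(pred, target, alpha=0.5, gamma=0.1):
    """DILATE-style loss (sketch): alpha-weighted sum of a soft-DTW shape
    term and a temporal term penalizing off-diagonal alignments."""
    loss_shape, A = soft_dtw(pred, target, gamma)
    n, m = len(pred), len(target)
    i = np.arange(n)[:, None]
    j = np.arange(m)[None, :]
    omega = (i - j) ** 2 / n ** 2   # assumed penalty: squared index mismatch
    loss_temporal = (A * omega).sum()
    return alpha * loss_shape + (1 - alpha) * loss_temporal
```

In a deep-learning framework one would implement the same two passes as a custom autograd function (as the abstract's "custom back-prop implementation" suggests), so the loss can drive gradient descent on the network's predictions.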
Vincent Le Guen (CNAM, Paris, France)
Nicolas Thome (CNAM, Conservatoire national des arts et métiers)