Data in the form of time-dependent sequential observations emerge in many key real-world problems, ranging from biological data, financial markets, and weather forecasting to audio/video processing. However, despite the ubiquity of such data, most mainstream machine learning algorithms have been primarily developed for settings in which sample points are drawn i.i.d. from some (usually unknown) fixed distribution. While there exist algorithms designed to handle non-i.i.d. data, these typically assume a specific parametric form for the data-generating distribution. Such assumptions may fail to capture the complex nature of modern data, which can possess long-range dependency patterns that we now have the computing power to discern. At the other extreme lie online learning algorithms that consider a more general framework without any distributional assumptions. However, by being purely agnostic, common online algorithms may not fully exploit the stochastic aspect of time-series data.
This is the third installment of the Time Series Workshop at NIPS, building on the success of the previous events: the NIPS 2015 Time Series Workshop and the NIPS 2016 Time Series Workshop.
The goal of this workshop is to bring together theoretical and applied researchers interested in the analysis of time series and the development of new algorithms to process sequential data. This includes algorithms for time series prediction, classification, clustering, anomaly and change point detection, correlation discovery, and dimensionality reduction, as well as a general theory for learning and comparing stochastic processes. We invite researchers from the related areas of batch and online learning, reinforcement learning, data analysis and statistics, econometrics, and many others to contribute to this workshop.
|09:00 AM||Introduction to Time Series Workshop (Opening Remarks)|
|09:15 AM||Marco Cuturi: Soft-DTW, a differentiable loss for time series data (Invited Talk)|
|10:00 AM||Learning theory and algorithms for shapelets and other local features. Daiki Suehiro, Kohei Hatano, Eiji Takimoto, Shuji Yamamoto, Kenichi Bannai and Akiko Takeda. (Contributed talk)|
|10:15 AM||Diffusion Convolutional Recurrent Neural Network: Data-Driven Traffic Forecasting. Yaguang Li, Rose Yu, Cyrus Shahabi and Yan Liu. (Contributed talk)|
|10:30 AM||Morning Coffee Break (Break)|
|11:00 AM||Panel discussion featuring Marco Cuturi (ENSAE / CREST), Claire Monteleoni (GWU), Karthik Sridharan (Cornell), Firdaus Janoos (Two Sigma) and Matthias Seeger (Amazon) (Panel Discussion)|
|11:45 AM||Poster Session|
|Jaleh Zand, Kun Tu, Michael (Tao-Yi) Lee, Ian Covert, Daniel Hernandez, Shina Ebrahimzadeh, Joanna Slawinska, Akara Supratak, Miao Lu, John Alberg, Dennis Shen, Serene Yeo, Hsing-Kuo K Pao, Kian Ming Adam Chai, Anish Agarwal, Dimitrios Giannakis, Muhammad Amjad|
|02:30 PM||Discovering Order in Unordered Datasets: Generative Markov Networks. Yao-Hung Hubert Tsai, Han Zhao, Nebojsa Jojic and Ruslan Salakhutdinov. (Contributed talk)|
|02:45 PM||Vitaly Kuznetsov: Kaggle web traffic time series forecasting competition: results and insights (Talk)|
|03:30 PM||Afternoon Coffee Break (Break)|
|04:00 PM||Skip RNN: Learning to Skip State Updates in Recurrent Neural Networks. Víctor Campos, Brendan Jou, Xavier Giró-I-Nieto, Jordi Torres and Shih-Fu Chang. (Contributed talk)|
|04:15 PM||Karthik Sridharan: Online learning, Probabilistic Inequalities and the Burkholder Method (Invited Talk)|
|05:00 PM||Scalable Joint Models for Reliable Event Prediction. Hossein Soleimani, James Hensman and Suchi Saria. (Contributed talk)|
|05:15 PM||Claire Monteleoni: Algorithms for Climate Informatics: Learning from spatiotemporal data with both spatial and temporal non-stationarity (Invited Talk)|
|06:00 PM||An Efficient ADMM Algorithm for Structural Break Detection in Multivariate Time Series. Alex Tank, Emily Fox and Ali Shojaie. (Contributed talk)|
|06:15 PM||Conclusion and Awards (Concluding remarks and Awards)|