Workshop
Fri Dec 09 11:00 PM -- 09:30 AM (PST) @ Area 5 + 6
Learning with Tensors: Why Now and How?
Anima Anandkumar · Rong Ge · Yan Liu · Maximilian Nickel · Qi (Rose) Yu

Workshop Home Page

Real-world data in many domains, such as healthcare, social media, and climate science, is multimodal and heterogeneous. Tensors, as generalizations of vectors and matrices, provide a natural and scalable framework for handling data with inherent structure and complex dependencies. The recent renaissance of tensor methods in machine learning ranges from academic research on scalable algorithms for tensor operations and novel models built on tensor representations to industry solutions including Google TensorFlow and the Tensor Processing Unit (TPU). In particular, scalable tensor methods have attracted considerable attention, with successes in a series of learning tasks, such as learning latent variable models [Anandkumar et al., 2014; Huang et al., 2015; Ge et al., 2015], relational learning [Nickel et al., 2011, 2014, 2016], spatio-temporal forecasting [Yu et al., 2014, 2015, 2016], and training deep neural networks [Novikov et al., 2015].
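
To make the kind of scalable tensor operation discussed above concrete, here is a minimal NumPy sketch of the classic CP (CANDECOMP/PARAFAC) decomposition fitted by alternating least squares for a 3-way tensor. This is an illustration only, not the method of any particular paper cited above; the helper names (khatri_rao, unfold, cp_als) are our own, and a dedicated tensor library would be preferable in practice.

import numpy as np

def khatri_rao(U, V):
    # Column-wise Khatri-Rao product: (I, R) and (J, R) -> (I*J, R),
    # with V's index varying fastest along rows.
    I, R = U.shape
    J = V.shape[0]
    return (U[:, None, :] * V[None, :, :]).reshape(I * J, R)

def unfold(X, mode):
    # Mode-n unfolding, consistent with NumPy's row-major memory order.
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def cp_als(X, rank, n_iters=100, seed=0):
    # Fit X ~ sum_r a_r (outer) b_r (outer) c_r by alternating least squares:
    # each step solves a linear least-squares problem for one factor
    # with the other two held fixed.
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((dim, rank)) for dim in X.shape)
    for _ in range(n_iters):
        A = unfold(X, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(X, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(X, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Sanity check on a synthetic rank-3 tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((d, 3)) for d in (10, 12, 14))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(X, rank=3)
X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print('relative error:', np.linalg.norm(X - X_hat) / np.linalg.norm(X))

The Hadamard-product identity (kr(B, C).T @ kr(B, C)) = (B.T @ B) * (C.T @ C) is what keeps each update cheap, since the pseudoinverse is only taken of an R-by-R matrix.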

This progress opens new directions and problems for tensor methods in machine learning. The workshop aims to foster the discussion, discovery, and dissemination of research activities and outcomes in this area and to encourage breakthroughs. We will bring together researchers in theory and applications who are interested in tensor analysis and the development of tensor-based algorithms. We will also invite researchers from related areas, such as numerical linear algebra, high-performance computing, deep learning, statistics, and data analysis, to contribute to this workshop. We believe that this workshop can foster new directions, closer collaborations, and novel applications. We also expect a deeper conversation regarding why learning with tensors is important at the current stage, where it is useful, what tensor computation software and hardware work well in practice, and how we can progress further with interesting research directions and open problems.

Opening Remarks
On Depth Efficiency of Convolutional Networks: the use of Hierarchical Tensor Decomposition for Network Design and Analysis (Keynote)
Contributed Talks (Talk)
Poster Spotlight 1 (Poster)
Coffee Break and Poster Session 1 (Break)
Tensor Network Ranks (Keynote)
Computational Phenotyping using Tensor Factorization (Keynote by Jimeng Sun)
Lunch (Break)
Orthogonalized Alternating Least Squares: A theoretically principled tensor factorization algorithm for practical use (Keynote)
Poster Spotlight 2 (Poster)
Coffee Break and Poster Session 2 (Break)
Tensor decompositions for big multi-aspect data analytics (Keynote)
PhD Symposium (Talk)
Panel Discussion and Closing Remarks (Panel)