Workshop: First Workshop on Quantum Tensor Networks in Machine Learning
Xiao-Yang Liu, Qibin Zhao, Jacob Biamonte, Cesar F Caiafa, Paul Pu Liang, Nadav Cohen, Stefan Leichenauer
2020-12-11T08:00:00-08:00 - 2020-12-11T19:00:00-08:00
Abstract: Quantum tensor networks in machine learning (QTNML) hold great potential to advance AI technologies. Quantum machine learning promises quantum advantages over classical machine learning (potentially exponential speedups in training, quadratic speedups in convergence, etc.), while tensor networks provide powerful simulations of quantum machine learning algorithms on classical computers. As a rapidly growing interdisciplinary area, QTNML may serve as an amplifier for computational intelligence, a transformer for machine learning innovations, and a propeller for AI industrialization.
Tensor networks, contracted networks of factor tensors, have arisen independently in several areas of science and engineering. Such networks appear in the description of physical processes, and an accompanying collection of numerical techniques has elevated quantum tensor networks into a variational model class for machine learning. Underlying these algorithms is the compression of the high-dimensional data needed to represent quantum states of matter. These compression techniques have recently proven ripe for application to many traditional problems in deep learning. Quantum tensor networks have shown significant power in compactly representing deep neural networks and in enabling their efficient training and theoretical understanding. Further QTNML techniques are rapidly emerging, such as approximating probability functions and probabilistic graphical models. However, QTNML is still a young topic, and many open problems remain to be explored.
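As an informal illustration of this compression idea (a sketch only, not code from the workshop or from any particular talk), the following NumPy snippet builds a tensor-train / matrix product state decomposition of a small multi-way array by sweeping truncated SVDs; the function names and the max_rank parameter are placeholders chosen for the example.

import numpy as np

def tt_decompose(tensor, max_rank):
    # Split an n-way array into 3-way "cores" by sweeping truncated SVDs
    # left to right (the TT-SVD / MPS construction).
    dims = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(u[:, :r].reshape(rank, dims[k], r))
        mat = (np.diag(s[:r]) @ vt[:r, :]).reshape(r * dims[k + 1], -1)
        rank = r
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_contract(cores):
    # Contract the chain of cores back into the full array.
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full[0, ..., 0]  # drop the dummy boundary indices

x = np.random.rand(2, 2, 2, 2, 2, 2)           # a 6-way array, 64 entries
cores = tt_decompose(x, max_rank=8)            # rank 8 suffices for an exact fit here
print(np.max(np.abs(x - tt_contract(cores))))  # ~1e-15

Storing the small cores in place of the full array is the kind of low-rank compression that tensor-network models exploit; lowering max_rank trades accuracy for a smaller representation.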
Quantum algorithms are typically described by quantum circuits (quantum computational networks). These circuits are themselves a class of tensor networks, creating an evident interplay between classical tensor-network contraction algorithms and the execution of tensor contractions on quantum processors. The modern field of quantum-enhanced machine learning has started to use several tools from tensor network theory to create new quantum models of machine learning and to better understand existing ones.
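To make the circuit/tensor-network correspondence concrete (an illustrative sketch, not tied to any talk or library in the program), the snippet below writes a two-qubit Bell-state circuit, a Hadamard followed by a CNOT, as tensors and evaluates it with np.einsum; the index labels are arbitrary choices for the example.

import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard as a (out, in) tensor
CNOT = np.zeros((2, 2, 2, 2))                          # indices: (out_ctrl, out_tgt, in_ctrl, in_tgt)
for c in range(2):
    for t in range(2):
        CNOT[c, c ^ t, c, t] = 1.0                     # target flips when the control is 1
zero = np.array([1.0, 0.0])                            # the |0> state of a single qubit

# |psi> = CNOT (H tensor I) |00>; circuit wires become summed einsum indices:
# k is qubit 0 before H, i/j enter the CNOT, a/b are the output wires.
psi = np.einsum('abij,ik,j,k->ab', CNOT, H, zero, zero)
print(psi)  # amplitude 1/sqrt(2) on |00> and on |11>: a Bell state

The same diagram can be contracted classically in any order or executed gate by gate on quantum hardware, which is the interplay described above.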
The interplay between tensor networks, machine learning, and quantum algorithms is rich. It rests not just on shared numerical methods but on the equivalence of tensor networks to various quantum circuits, on rapidly developing algorithms from the mathematics and physics communities for optimizing and transforming tensor networks, and on connections to low-rank methods for learning. A merger of tensor network algorithms with state-of-the-art approaches in deep learning is now taking place. A new community is forming, which this workshop aims to foster.
Schedule
2020-12-11T08:00:00-08:00 - 2020-12-11T08:05:00-08:00
Opening Remarks
Xiao-Yang Liu
2020-12-11T08:05:00-08:00 - 2020-12-11T08:37:00-08:00
Talk 1: Expressiveness in Deep Learning via Tensor Networks and Quantum Entanglement
Nadav Cohen
2020-12-11T08:37:00-08:00 - 2020-12-11T08:45:00-08:00
Talk 1 Q&A
2020-12-11T08:45:00-08:00 - 2020-12-11T09:30:00-08:00
Talk 2: TBD
Animashree Anandkumar
2020-12-11T09:30:00-08:00 - 2020-12-11T10:15:00-08:00
Talk 3: Quantum in ML and ML in Quantum
Ivan Oseledets
2020-12-11T10:15:00-08:00 - 2020-12-11T10:25:00-08:00
Talk 3 Q&A
2020-12-11T10:25:00-08:00 - 2020-12-11T10:52:00-08:00
Talk 4: A Century of the Tensor Network Formulation from the Ising Model
Tomotoshi Nishino
2020-12-11T10:50:00-08:00 - 2020-12-11T11:30:00-08:00
Talk 5: Getting Started with Tensor Networks
Glen Evenbly
2020-12-11T11:30:00-08:00 - 2020-12-11T12:10:00-08:00
Talk 6: Tensor Network Models for Structured Data
Guillaume Rabusseau
2020-12-11T12:10:00-08:00 - 2020-12-11T13:10:00-08:00
Panel Discussion 1: Theoretical, Algorithmic and Physical
Jacob Biamonte, Xiao-Yang Liu, Nadav Cohen, Martin Ganahl, Glen Evenbly, Ivan Oseledets, Paul Springer
2020-12-11T12:10:00-08:00
Panel Discussion 2: Software and High Performance Implementation
2020-12-11T13:10:00-08:00 - 2020-12-11T13:50:00-08:00
Talk 7: cuTensor: High-Performance CUDA Tensor Primitives
Paul Springer
2020-12-11T13:50:00-08:00 - 2020-12-11T14:30:00-08:00
Talk 8: TensorNetwork: A Python Package for Tensor Network Computations
Martin Ganahl
2020-12-11T14:30:00-08:00 - 2020-12-11T15:10:00-08:00
Talk 9: Tensor Methods for Efficient and Interpretable Spatiotemporal Learning
Rose Yu
2020-12-11T15:10:00-08:00 - 2020-12-11T15:50:00-08:00
Talk 10: Tensor Networks as a Data Structure in Probabilistic Modeling and for Learning Dynamical Laws from Data
Jens Eisert
2020-12-11T15:50:00-08:00 - 2020-12-11T16:30:00-08:00
Talk 11: Tensor networks and counting problems on the lattice
Frank Verstraete
2020-12-11T16:30:00-08:00 - 2020-12-11T16:42:00-08:00
Contributed Talk 1: Paper 3: Tensor network approaches for data-driven identification of non-linear dynamical laws
Alex Goeßmann
2020-12-11T16:42:00-08:00 - 2020-12-11T16:54:00-08:00
Contributed Talk 2: Paper 6: Anomaly Detection with Tensor Networks
Jensen Wang
2020-12-11T16:54:00-08:00 - 2020-12-11T17:06:00-08:00
Contributed Talk 3: Paper 19: Deep convolutional tensor network
Philip Blagoveschensky
2020-12-11T17:06:00-08:00 - 2020-12-11T17:18:00-08:00
Contributed Talk 4: Paper 27: Limitations of gradient-based Born Machine over tensor networks on learning quantum nonlocality
Khadijeh Najafi
2020-12-11T17:18:00-08:00 - 2020-12-11T17:30:00-08:00
Contributed Talk 5: Paper 32: High-order Learning Model via Fractional Tensor Network Decomposition
Chao Li
2020-12-11T17:30:00-08:00 - 2020-12-11T18:10:00-08:00
Talk 12: Learning Quantum Channels with Tensor Networks
Giacomo Torlai
2020-12-11T18:10:00-08:00 - 2020-12-11T18:50:00-08:00
Talk 13: High Performance Computation for Tensor Networks Learning
Anwar Walid, Xiao-Yang Liu
2020-12-11T18:50:00-08:00 - 2020-12-11T19:00:00-08:00
Closing Remarks
Xiao-Yang Liu