
Second Workshop on Quantum Tensor Networks in Machine Learning
Xiao-Yang Liu · Qibin Zhao · Ivan Oseledets · Yufei Ding · Guillaume Rabusseau · Khadijeh Najafi · Andrzej Cichocki · Masashi Sugiyama · Anwar Walid

Tue Dec 14 05:00 AM -- 05:00 PM (PST)
Event URL: https://tensorworkshop.github.io/NeurIPS2021/index.html

Quantum tensor networks in machine learning (QTNML) hold great potential to advance AI technologies. Quantum machine learning [1][2] promises quantum advantages over classical machine learning, potentially exponential speedups in training [3] and quadratic improvements in learning efficiency [4], while tensor networks provide powerful classical simulations of quantum machine learning algorithms. As a rapidly growing interdisciplinary area, QTNML may serve as an amplifier for computational intelligence, a transformer for machine learning innovations, and a propeller for AI industrialization.

Tensor networks [5], contracted networks of factor core tensors, have arisen independently in several areas of science and engineering. Such networks appear in the description of physical processes, and an accompanying collection of numerical techniques has elevated quantum tensor networks into a variational model of machine learning. These techniques have recently proven ripe for application to many traditional problems in deep learning [6,7,8]. Further QTNML technologies are rapidly emerging, such as approximating probability functions and probabilistic graphical models [9,10,11,12]. Quantum algorithms are typically described by quantum circuits (quantum computational networks), which are themselves a class of tensor networks, creating an evident interplay between classical tensor network contraction algorithms and the execution of tensor contractions on quantum processors. The modern field of quantum-enhanced machine learning has begun to utilize tools from tensor network theory to create new quantum models of machine learning and to better understand existing ones. However, QTNML is still a young topic, and many open problems remain to be explored.
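To make the idea of "a contracted network of factor core tensors" concrete, the sketch below (not drawn from the workshop materials; all names and dimensions are illustrative assumptions) builds a matrix product state, one of the simplest tensor networks, from random NumPy cores and contracts it along its shared bond indices into the full dense tensor it represents:

```python
import numpy as np

# Minimal sketch, assuming a matrix product state (MPS) as the example network.
# An MPS represents a d**n-entry tensor as a chain of small "core" tensors
# contracted along shared bond indices of size chi.
rng = np.random.default_rng(0)
n, d, chi = 4, 2, 3  # sites, physical dimension, bond dimension (illustrative)

# Boundary cores have shape (d, chi) and (chi, d); interior cores (chi, d, chi).
cores = [rng.normal(size=(d, chi))]
cores += [rng.normal(size=(chi, d, chi)) for _ in range(n - 2)]
cores.append(rng.normal(size=(chi, d)))

# Contract the chain left to right: each step sums over one bond index.
full = cores[0]
for core in cores[1:-1]:
    full = np.tensordot(full, core, axes=([-1], [0]))
full = np.tensordot(full, cores[-1], axes=([-1], [0]))

# The result is the full n-index tensor with d**n entries, while the network
# itself stores only O(n * d * chi**2) parameters.
print(full.shape)  # (2, 2, 2, 2)
```

For larger `n` the dense tensor becomes exponentially large, whereas the core-tensor parameterization grows only linearly in `n`; this gap is what makes tensor networks attractive as variational machine learning models.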

Author Information

Xiao-Yang Liu (Columbia University)
Qibin Zhao (RIKEN AIP)
Ivan Oseledets (Skolkovo Institute of Science and Technology)
Yufei Ding (UC Santa Barbara)
Guillaume Rabusseau (Mila - Université de Montréal)
Khadijeh Najafi (Harvard and Caltech)
Andrzej Cichocki (Skolkovo Institute of Science and Technology)
Masashi Sugiyama (RIKEN AIP/The University of Tokyo)
Anwar Walid (Columbia University)