Workshop
Tue Dec 14 06:00 AM -- 02:35 PM (PST)
Second Workshop on Quantum Tensor Networks in Machine Learning
Xiao-Yang Liu · Qibin Zhao · Ivan Oseledets · Yufei Ding · Guillaume Rabusseau · Khadijeh Najafi · Andrzej Cichocki · Masashi Sugiyama · Anwar Walid






Quantum tensor networks in machine learning (QTNML) hold great potential to advance AI technologies. Quantum machine learning [1][2] promises quantum advantages over classical machine learning (potentially exponential speedups in training [3] and quadratic improvements in learning efficiency [4]), while tensor networks enable powerful classical simulation of quantum machine learning algorithms. As a rapidly growing interdisciplinary area, QTNML may serve as an amplifier for computational intelligence, a transformer for machine learning innovations, and a propeller for AI industrialization.

Tensor networks [5], contracted networks of factor core tensors, have arisen independently in several areas of science and engineering. Such networks appear in the description of physical processes, and an accompanying collection of numerical techniques has elevated quantum tensor networks into a variational model class for machine learning. These techniques have recently proven ripe for application to many traditional problems in deep learning [6,7,8]. Further QTNML technologies are rapidly emerging, such as approximating probability functions and probabilistic graphical models [9,10,11,12]. Quantum algorithms are typically described by quantum circuits (quantum computational networks), which are themselves a class of tensor networks, creating an evident interplay between classical tensor network contraction algorithms and the execution of tensor contractions on quantum processors. The modern field of quantum-enhanced machine learning has begun to use tools from tensor network theory to create new quantum models of machine learning and to better understand existing ones. However, QTNML is a relatively young topic, and many open problems remain to be explored.
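As an illustrative sketch (not taken from any workshop submission), the "contracted network of factor core tensors" idea can be shown with a tiny matrix product state (MPS), one of the simplest tensor networks; the core shapes, bond dimension, and random values below are assumptions chosen only for demonstration:

```python
import numpy as np

d, chi = 2, 3  # physical dimension and bond dimension (arbitrary choices)

rng = np.random.default_rng(0)
# A three-site MPS: boundary cores are matrices, the middle core is 3-way.
A1 = rng.standard_normal((d, chi))       # indices: (physical, right bond)
A2 = rng.standard_normal((chi, d, chi))  # indices: (left bond, physical, right bond)
A3 = rng.standard_normal((chi, d))       # indices: (left bond, physical)

# Contracting the shared bond indices recovers the full 3-site tensor,
# i.e. an amplitude for every basis state |i j k>.
psi = np.einsum('ia,ajb,bk->ijk', A1, A2, A3)
print(psi.shape)  # (2, 2, 2)
```

The full tensor has d^n entries, while the MPS stores only the small cores; contraction order (here delegated to `np.einsum`) is exactly the kind of classical tensor network contraction problem that interfaces with quantum circuit execution.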

Opening Remarks (Opening)
Anima Anandkumar (Talk)
Anima Anandkumar (Q&A)
Anwar Walid (Talk)
Anwar Walid (Q&A)
Danilo P. Mandic (Talk)
Danilo P. Mandic (Q&A)
Nadav Cohen (Talk)
Nadav Cohen (Q&A)
Stefanos Kourtis (Talk)
Stefanos Kourtis (Q&A)
Coffee Break + Poster Session (GatherTown) (poster session)
Model based multi-agent reinforcement learning with tensor decompositions (Oral)
Improvements to gradient descent methods for quantum tensor network machine learning (Oral)
Tensor Rings for Learning Circular Hidden Markov Models (Oral)
ContracTN: A Tensor Network Library Designed for Machine Learning (Oral)
Tensor Ring Parametrized Variational Quantum Circuits for Large Scale Quantum Machine Learning (Oral)
Nonparametric tensor estimation with unknown permutations (Oral)
Bayesian Tensor Networks (Oral)
A Tensorized Spectral Attention Mechanism for Efficient Natural Language Processing (Oral)
Rademacher Random Projections with Tensor Networks (Oral)
Reinforcement Learning in Factored Action Spaces using Tensor Decompositions (Oral)
Towards a Trace-Preserving Tensor Network Representation of Quantum Channels (Oral)
Distributive Pre-training of Generative Modeling Using Matrix Product States (Oral)
Discussion Panel
Closing Remarks (Closing)
ContracTN: A Tensor Network Library Designed for Machine Learning (Poster)
Codee: A Tensor Embedding Scheme for Binary Code Search (Poster)
Quantum Machine Learning for Earth Observation Images (Poster)
Graph-Tensor Singular Value Decomposition for Data Recovery (Poster)
Probabilistic Graphical Models and Tensor Networks: A Hybrid Framework (Poster)
Born Machines for Periodic and Open XY Quantum Spin Chains (Poster)
Low-Rank Tensor Completion via Coupled Framelet Transform (Poster)
Rademacher Random Projections with Tensor Networks (Poster)
Spectral Tensor Layer for Model-Parallel Deep Neural Networks (Poster)
Nonparametric tensor estimation with unknown permutations (Poster)
Bayesian Latent Factor Model for Higher-order Data: an Extended Abstract (Poster)
Is Rank Minimization of the Essence to Learn Tensor Network Structure? (Poster)
Matrix product state for quantum-inspired feature extraction and compressed sensing (Poster)
Deep variational reinforcement learning by optimizing Hamiltonian equation (Poster)
Multiway Spherical Clustering via Degree-Corrected Tensor Block Models (Poster)
A Tensorized Spectral Attention Mechanism for Efficient Natural Language Processing (Poster)
Improvements to gradient descent methods for quantum tensor network machine learning (Poster)
Bayesian Tensor Networks (Poster)
DTAE: Deep Tensor Autoencoder for 3-D Seismic Data Interpolation (Poster)
Distributive Pre-training of Generative Modeling Using Matrix Product States (Poster)
Towards a Trace-Preserving Tensor Network Representation of Quantum Channels (Poster)
Model based multi-agent reinforcement learning with tensor decompositions (Poster)
Tensor Rings for Learning Circular Hidden Markov Models (Poster)
Reinforcement Learning in Factored Action Spaces using Tensor Decompositions (Poster)
Fully-Connected Tensor Network Decomposition (Poster)
QTN-VQC: An End-to-End Learning Framework for Quantum Neural Networks (Poster)
Tensor Ring Parametrized Variational Quantum Circuits for Large Scale Quantum Machine Learning (Poster)
High Performance Hierarchical Tucker Tensor Learning Using GPU Tensor Cores (Poster)