Workshop
Tue Dec 14 06:00 AM -- 02:35 PM (PST)
Second Workshop on Quantum Tensor Networks in Machine Learning
Xiao-Yang Liu · Qibin Zhao · Ivan Oseledets · Yufei Ding · Guillaume Rabusseau · Jean Kossaifi · Khadijeh Najafi · Anwar Walid · Andrzej Cichocki · Masashi Sugiyama

Quantum tensor networks in machine learning (QTNML) hold great potential to advance AI technologies. Quantum machine learning [1][2] promises quantum advantages over classical machine learning (potentially exponential speedups in training [3] and quadratic improvements in learning efficiency [4]), while tensor networks provide powerful classical simulations of quantum machine learning algorithms. As a rapidly growing interdisciplinary area, QTNML may serve as an amplifier for computational intelligence, a transformer for machine learning innovations, and a propeller for AI industrialization.

Tensor networks [5], contracted networks of factor core tensors, have arisen independently in several areas of science and engineering. Such networks appear in the description of physical processes, and an accompanying collection of numerical techniques has elevated quantum tensor networks into a variational model of machine learning. These techniques have recently proven ripe for application to many traditional problems in deep learning [6,7,8]. Further QTNML technologies are rapidly emerging, such as approximating probability functions and probabilistic graphical models [9,10,11,12]. Quantum algorithms are typically described by quantum circuits (quantum computational networks), which are themselves a class of tensor networks; this creates an evident interplay between classical tensor network contraction algorithms and the execution of tensor contractions on quantum processors. The modern field of quantum-enhanced machine learning has started to utilize tools from tensor network theory to create new quantum models of machine learning and to better understand existing ones. However, the topic of QTNML is relatively young and many open problems remain to be explored.
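To make the "contracted network of factor core tensors" idea concrete, here is a minimal sketch (not from the workshop materials, just an illustration) of a matrix product state (MPS), the tensor network format that appears in several of the talks and posters below. It builds random 3-way cores, contracts them along shared bond indices to recover the full tensor, and shows the parameter savings; all function names are hypothetical.

```python
import numpy as np

def random_mps(num_sites, phys_dim, bond_dim, seed=0):
    """Random MPS cores with open boundary conditions.

    Core i has shape (left_bond, phys_dim, right_bond); the boundary
    bonds have dimension 1.
    """
    rng = np.random.default_rng(seed)
    cores = []
    for i in range(num_sites):
        left = 1 if i == 0 else bond_dim
        right = 1 if i == num_sites - 1 else bond_dim
        cores.append(rng.standard_normal((left, phys_dim, right)))
    return cores

def contract_mps(cores):
    """Contract all cores into the full tensor of shape (d, ..., d)."""
    result = cores[0]  # shape (1, d, r)
    for core in cores[1:]:
        # Sum over the shared bond index:
        # (1, d, ..., d, r) x (r, d, r') -> (1, d, ..., d, d, r')
        result = np.tensordot(result, core, axes=([-1], [0]))
    # Drop the trivial boundary bonds of dimension 1.
    return result.squeeze(axis=(0, -1))

cores = random_mps(num_sites=10, phys_dim=2, bond_dim=4)
full = contract_mps(cores)
print(full.shape)  # (2, 2, 2, 2, 2, 2, 2, 2, 2, 2)

mps_params = sum(core.size for core in cores)
print(mps_params, full.size)  # 272 parameters vs. 1024 entries
```

For a fixed bond dimension the MPS stores O(n d r^2) parameters instead of the d^n entries of the full tensor, which is the compression that makes tensor network models tractable on classical hardware.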

Opening Remarks (Opening)
Efficient Quantum Optimization via Multi-Basis Encodings and Tensor Rings (Talk)
Anima Anandkumar (Q&A)
High Performance Computation for Tensor Networks Learning (Talk)
Anwar Walid (Q&A)
Multi-graph Tensor Networks: Big Data Analytics on Irregular Domains (Talk)
Danilo P. Mandic (Q&A)
Implicit Regularization in Quantum Tensor Networks (Talk)
Nadav Cohen (Q&A)
Stefanos Kourtis (Talk)
Stefanos Kourtis (Q&A)
Coffee Break + Poster Session (GatherTown) (poster session)
Model based multi-agent reinforcement learning with tensor decompositions (Oral)
Improvements to gradient descent methods for quantum tensor network machine learning (Oral)
Tensor Rings for Learning Circular Hidden Markov Models (Oral)
ContracTN: A Tensor Network Library Designed for Machine Learning (Oral)
Tensor Ring Parametrized Variational Quantum Circuits for Large Scale Quantum Machine Learning (Oral)
Nonparametric tensor estimation with unknown permutations (Oral)
Bayesian Tensor Networks (Oral)
A Tensorized Spectral Attention Mechanism for Efficient Natural Language Processing (Oral)
Rademacher Random Projections with Tensor Networks (Oral)
Reinforcement Learning in Factored Action Spaces using Tensor Decompositions (Oral)
Towards a Trace-Preserving Tensor Network Representation of Quantum Channels (Oral)
Distributive Pre-training of Generative Modeling Using Matrix Product States (Oral)
Discussion Panel
Closing Remarks (Closing)
Codee: A Tensor Embedding Scheme for Binary Code Search (Poster)
ContracTN: A Tensor Network Library Designed for Machine Learning (Poster)
Reinforcement Learning in Factored Action Spaces using Tensor Decompositions (Poster)
Nonparametric tensor estimation with unknown permutations (Poster)
Bayesian Latent Factor Model for Higher-order Data: an Extended Abstract (Poster)
A Tensorized Spectral Attention Mechanism for Efficient Natural Language Processing (Poster)
Matrix product state for quantum-inspired feature extraction and compressed sensing (Poster)
Born Machines for Periodic and Open XY Quantum Spin Chains (Poster)
Quantum Machine Learning for Earth Observation Images (Poster)
Fully-Connected Tensor Network Decomposition (Poster)
Distributive Pre-training of Generative Modeling Using Matrix Product States (Poster)
Deep variational reinforcement learning by optimizing Hamiltonian equation (Poster)
Towards a Trace-Preserving Tensor Network Representation of Quantum Channels (Poster)
DTAE: Deep Tensor Autoencoder for 3-D Seismic Data Interpolation (Poster)
QTN-VQC: An End-to-End Learning Framework for Quantum Neural Networks (Poster)
Tensor Ring Parametrized Variational Quantum Circuits for Large Scale Quantum Machine Learning (Poster)
Probabilistic Graphical Models and Tensor Networks: A Hybrid Framework (Poster)
Bayesian Tensor Networks (Poster)
Graph-Tensor Singular Value Decomposition for Data Recovery (Poster)
Multiway Spherical Clustering via Degree-Corrected Tensor Block Models (Poster)
High Performance Hierarchical Tucker Tensor Learning Using GPU Tensor Cores (Poster)
Model based multi-agent reinforcement learning with tensor decompositions (Poster)
Is Rank Minimization of the Essence to Learn Tensor Network Structure? (Poster)
Spectral Tensor Layer for Model-Parallel Deep Neural Networks (Poster)
Low-Rank Tensor Completion via Coupled Framelet Transform (Poster)
Rademacher Random Projections with Tensor Networks (Poster)
Tensor Rings for Learning Circular Hidden Markov Models (Poster)
Improvements to gradient descent methods for quantum tensor network machine learning (Poster)