
Tutorial

Message Passing In Machine Learning

Wee Sun Lee

Moderators: Sinead Williamson · Mohammad Emtiyaz Khan

Virtual

Abstract:

Message passing algorithms are distributed algorithms that operate on graphs, where each node uses only information present locally at the node and its incident edges, and sends information only to its neighbouring nodes. They are often highly effective in machine learning and are relatively easy to parallelise. Examples include approximate inference algorithms on probabilistic graphical models, the value iteration algorithm for Markov decision processes, graph neural networks and attention networks.
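To make the local nature of these updates concrete, here is a minimal sketch (not from the tutorial) of one synchronous message-passing step on a toy undirected graph; the graph, the message function, and the sum-based update rule are illustrative assumptions.

```python
# One synchronous message-passing step: every node sends a message along each
# incident edge, then updates using only its own state and received messages.

adjacency = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}   # toy graph (assumed)
node_state = {0: 1.0, 1: 0.5, 2: -0.25, 3: 2.0}            # toy node states (assumed)

def message(sender_state):
    # A node's outgoing message depends only on its local state.
    return 0.5 * sender_state

def update(own_state, incoming):
    # A node's new state depends only on its own state and incoming messages.
    return own_state + sum(incoming)

def step(adjacency, node_state):
    # Compute all messages from the current states, then apply updates at once.
    messages = {v: [message(node_state[u]) for u in adjacency[v]] for v in adjacency}
    return {v: update(node_state[v], messages[v]) for v in adjacency}

print(step(adjacency, node_state))
```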

This tutorial presents commonly used approximate inference algorithms for probabilistic graphical models and the value iteration algorithm for Markov decision processes, focusing on understanding the objectives that the algorithms are optimising. We then consider more flexible but less interpretable message passing algorithms, including graph neural networks and attention networks. We discuss how these more flexible networks can simulate the more interpretable algorithms, providing some understanding of the inductive biases of these networks through algorithmic alignment and allowing this understanding to be used for network design.
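As one concrete instance of the algorithms named above, the sketch below runs value iteration on a randomly generated toy MDP, with the Bellman backup written so that each state aggregates discounted value "messages" from its successor states; the toy MDP, discount factor, and variable names are assumptions made for illustration, not material from the tutorial.

```python
import numpy as np

# Toy MDP (assumed): random transition kernel P[s, a] over next states and
# random expected rewards R[s, a].
n_states, n_actions = 4, 2
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))  # shape (S, A, S)
R = rng.normal(size=(n_states, n_actions))                        # shape (S, A)
gamma = 0.9  # discount factor (assumed)

V = np.zeros(n_states)
for _ in range(100):
    # Bellman backup: each state combines its local rewards with discounted
    # "messages" gamma * P[s, a] @ V from its successor states, then keeps
    # the best action's value.
    Q = R + gamma * (P @ V)      # shape (S, A)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

print(V)
```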
