Expo Talk Panel
Room R06-R09 (level 2)

Abstract

This will be a 50-minute presentation covering work at the intersection of graph representation learning and artificial intelligence at Google. It will provide a general overview of graph neural networks (GNNs) and large language models (LLMs), then go into three areas that we think will be of interest to a general machine learning audience:

GNNs to Optimize AI Model Execution [1,2]: recent work on using learned cost models to improve compiler performance for AI models.

Encoding Graphs as Text for GenAI Models [3]: insights on how best to encode structured data, such as graphs, for LLMs and other GenAI models.

Using AI-Focused Accelerators for Graph Representation Learning [4]: work on using hardware acceleration designed primarily for GenAI models for graph representation learning.
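As a flavor of the second topic, graph-to-text encoding simply means verbalizing a graph's structure so an LLM can reason over it in a prompt. The sketch below is a minimal, hypothetical illustration in the spirit of [3]; the function name and the specific sentence templates are our own assumptions, not the paper's API.

```python
# Hypothetical sketch: verbalize an undirected edge list as English
# sentences that can be prepended to an LLM prompt. The templates here
# are illustrative; the paper [3] compares many such encodings.

def encode_edges_as_text(edges):
    """Return a plain-English description of an undirected graph."""
    nodes = sorted({v for edge in edges for v in edge})
    lines = [f"G describes a graph among nodes {', '.join(map(str, nodes))}."]
    for u, v in edges:
        lines.append(f"Node {u} is connected to node {v}.")
    return " ".join(lines)

# Build a prompt that asks the model a question about the encoded graph.
prompt = (
    encode_edges_as_text([(0, 1), (1, 2)])
    + " Is there a path from node 0 to node 2?"
)
```

The choice of encoding (e.g., integer node IDs versus names, edge sentences versus adjacency lists) measurably affects LLM reasoning accuracy, which is the central finding the talk's third segment draws on.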

All presenters are experts currently working in this area.

References

[1] "TpuGraphs: A Performance Prediction Dataset on Large Tensor Computational Graphs." Phitchaya Mangpo Phothilimthana, Sami Abu-El-Haija, Kaidi Cao, Bahare Fatemi, Charith Mendis, Bryan Perozzi. https://arxiv.org/pdf/2308.13490.pdf

[2] "Learning Large Graph Property Prediction via Graph Segment Training." Kaidi Cao, Phitchaya Mangpo Phothilimthana, Sami Abu-El-Haija, Dustin Zelle, Yanqi Zhou, Charith Mendis, Jure Leskovec, Bryan Perozzi. https://arxiv.org/pdf/2305.12322.pdf

[3] "Talk Like a Graph: Encoding Graphs for Large Language Models." Bahare Fatemi, Jonathan Halcrow, Bryan Perozzi. https://arxiv.org/pdf/2310.04560.pdf

[4] "HUGE: Huge Unsupervised Graph Embeddings with TPUs." Brandon A. Mayer, Anton Tsitsulin, Hendrik Fichtenberger, Jonathan Halcrow, Bryan Perozzi. https://arxiv.org/pdf/2307.14490.pdf
