
Poster

Learning on graphs using Orthonormal Representation is Statistically Consistent

Rakesh Shivanna · Chiranjib Bhattacharyya

Level 2, room 210D

Abstract: Existing research \cite{reg} suggests that embedding graphs on a unit sphere can be beneficial in learning labels on the vertices of a graph. However, the choice of an optimal embedding remains an open issue. \emph{Orthonormal representation} of graphs, a class of embeddings over the unit sphere, was introduced by Lov\'asz \cite{lovasz_shannon}. In this paper, we show that there exist orthonormal representations which are statistically consistent over a large class of graphs, including power law and random graphs. This result is achieved by extending the notion of consistency defined in the inductive setting to graph transduction. As part of the analysis, we explicitly derive relationships between the Rademacher complexity measure and structural properties of graphs, such as the chromatic number. We further show that the fraction of vertices of a graph $G$ on $n$ nodes that need to be labelled for the learning algorithm to be consistent, also known as the labelled sample complexity, is $\Omega\left(\left(\frac{\vartheta(G)}{n}\right)^{\frac{1}{4}}\right)$, where $\vartheta(G)$ is the famous Lov\'asz~$\vartheta$ function of the graph. This, for the first time, relates labelled sample complexity to graph connectivity properties, such as the density of graphs. In the multiview setting, whenever individual views are expressed by a graph, it is a well-known heuristic that a convex combination of Laplacians \cite{lap_mv1} tends to improve accuracy. The analysis presented here extends easily to multiple graph transduction, and helps develop a sound statistical understanding of this heuristic, which was previously unavailable.
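The labelled sample complexity bound above depends on the Lov\'asz~$\vartheta$ function, which is computable in polynomial time as a semidefinite program. The snippet below is a minimal illustrative sketch, not code from the paper: it computes $\vartheta(G)$ from an adjacency matrix via the standard SDP formulation ($\max \langle J, X\rangle$ s.t. $X \succeq 0$, $\mathrm{tr}(X)=1$, $X_{ij}=0$ on edges) and evaluates the implied labelled fraction $(\vartheta(G)/n)^{1/4}$. The use of cvxpy and the helper name `lovasz_theta` are assumptions made for illustration.

```python
# Illustrative sketch (not the paper's implementation): Lovasz theta via SDP
# and the implied labelled-sample fraction (theta(G)/n)^(1/4).
# Assumes cvxpy with an SDP-capable solver (e.g. the bundled SCS).
import numpy as np
import cvxpy as cp


def lovasz_theta(adj):
    """Standard SDP: maximize <J, X> s.t. X PSD, tr(X) = 1, X_ij = 0 on edges."""
    n = adj.shape[0]
    X = cp.Variable((n, n), symmetric=True)
    constraints = [X >> 0, cp.trace(X) == 1]
    constraints += [X[i, j] == 0
                    for i in range(n) for j in range(i + 1, n) if adj[i, j]]
    prob = cp.Problem(cp.Maximize(cp.sum(X)), constraints)
    prob.solve()
    return prob.value


# Example: the 5-cycle C5, for which theta(C5) = sqrt(5) ~ 2.236.
n = 5
adj = np.zeros((n, n), dtype=int)
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1

theta = lovasz_theta(adj)
print(f"theta(C5) ~ {theta:.3f}")
print(f"labelled fraction (theta/n)^(1/4) ~ {(theta / n) ** 0.25:.3f}")
```

For denser graphs $\vartheta(G)$ is smaller, so the sketch illustrates the qualitative message of the bound: denser graphs require a smaller fraction of labelled vertices for consistency.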
