We show that, either explicitly or implicitly, various well-known graph-based models exhibit a common and significant \emph{harmonic} structure in their target functions: the value of a vertex is approximately the weighted average of the values of its adjacent neighbors. Understanding this structure and analyzing the loss defined over it reveal important properties of the target function over a graph. In this paper, we show that the variation of the target function across a cut can be upper and lower bounded by the ratio of its harmonic loss to the cut cost. We use this result to develop an analytical tool and analyze five popular models in graph-based learning: absorbing random walks, partially absorbing random walks, hitting times, the pseudo-inverse of the graph Laplacian, and eigenvectors of the Laplacian matrix. Our analysis explains several open questions about these models reported in the literature, and it provides theoretical justification and guidelines for their practical use. Simulations on synthetic and real datasets support our analysis.
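The harmonic structure described above can be made concrete with a small sketch (not the paper's code): on a graph with fixed boundary values, the harmonic extension satisfies f(i) = (1/d_i) * sum_j w_ij f(j) at every interior vertex, and can be computed by solving a linear system in the graph Laplacian. The path graph, boundary values, and variable names below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: harmonic function on an unweighted path graph
# 0-1-2-3-4 with boundary values f(0)=0 and f(4)=1. At each interior
# vertex, f equals the (weighted) average of its neighbors' values.

# Adjacency matrix of the path graph.
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1.0

D = np.diag(W.sum(axis=1))
L = D - W  # combinatorial graph Laplacian

boundary = {0: 0.0, 4: 1.0}
interior = [i for i in range(5) if i not in boundary]

# Harmonicity on the interior means L_II f_I = -L_IB f_B.
L_II = L[np.ix_(interior, interior)]
L_IB = L[np.ix_(interior, list(boundary))]
f_B = np.array(list(boundary.values()))
f_I = np.linalg.solve(L_II, -L_IB @ f_B)

f = np.empty(5)
f[list(boundary)] = f_B
f[interior] = f_I
print(f)  # linear interpolation: [0. 0.25 0.5 0.75 1.]

# Check the averaging property at each interior vertex.
for i in interior:
    assert np.isclose(f[i], W[i] @ f / D[i, i])
```

On a path graph the harmonic extension is simply linear interpolation between the boundary values; on general weighted graphs the same linear system gives the weighted-average (harmonic) solution.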
Author Information
Xiao-Ming Wu (Columbia University)
Zhenguo Li (Huawei Noah's Ark Lab, Hong Kong)
Shih-Fu Chang (Columbia University)
More from the Same Authors
- 2014 Poster: Discrete Graph Hashing (Wei Liu · Cun Mu · Sanjiv Kumar · Shih-Fu Chang)
- 2014 Spotlight: Discrete Graph Hashing (Wei Liu · Cun Mu · Sanjiv Kumar · Shih-Fu Chang)
- 2012 Poster: Learning with Partially Absorbing Random Walks (Xiao-Ming Wu · Zhenguo Li · Shih-Fu Chang · John Wright · Anthony Man-Cho So)
- 2009 Poster: Fast Graph Laplacian Regularized Kernel Learning via Semidefinite–Quadratic–Linear Programming (Xiao-Ming Wu · Anthony Man-Cho So · Zhenguo Li · Shuo-Yen Robert Li)
- 2009 Spotlight: Fast Graph Laplacian Regularized Kernel Learning via Semidefinite–Quadratic–Linear Programming (Xiao-Ming Wu · Anthony Man-Cho So · Zhenguo Li · Shuo-Yen Robert Li)