Parallel Double Greedy Submodular Maximization
Xinghao Pan · Stefanie Jegelka · Joseph Gonzalez · Joseph K Bradley · Michael Jordan

Wed Dec 10 04:00 PM -- 08:59 PM (PST) @ Level 2, room 210D

Many machine learning problems can be reduced to the maximization of submodular functions. Although well understood in the serial setting, the parallel maximization of submodular functions remains an open area of research, with recent results only addressing monotone functions. The optimal algorithm for maximizing the more general class of non-monotone submodular functions was introduced by Buchbinder et al. and follows a strongly serial double-greedy logic. In this work, we propose two methods to parallelize the double-greedy algorithm. The first, coordination-free approach emphasizes speed at the cost of a weaker approximation guarantee. The second, concurrency control approach guarantees a tight 1/2-approximation, at the quantifiable cost of additional coordination and reduced parallelism. As a consequence, we explore the trade-off space between guaranteed performance and objective optimality. We implement and evaluate both algorithms on multi-core hardware and billion-edge graphs, demonstrating both the scalability and the trade-offs of each approach.
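The serial double-greedy algorithm of Buchbinder et al. referenced above can be sketched as follows. This is a minimal illustration, not the paper's parallel implementation: it assumes the objective `f` is a non-negative submodular set function evaluated via a user-supplied callable, and it shows the randomized variant, which achieves the 1/2-approximation in expectation.

```python
import random

def double_greedy(f, ground_set):
    """Serial double greedy for unconstrained non-monotone submodular
    maximization (Buchbinder et al., randomized variant).

    Maintains two sets X (growing from empty) and Y (shrinking from the
    full ground set); after one pass over the elements, X == Y and is
    returned as the solution.
    """
    X, Y = set(), set(ground_set)
    for e in ground_set:
        a = f(X | {e}) - f(X)  # marginal gain of adding e to X
        b = f(Y - {e}) - f(Y)  # marginal gain of removing e from Y
        a, b = max(a, 0.0), max(b, 0.0)
        # Add e to X with probability a / (a + b); otherwise drop it from Y.
        if a + b == 0 or random.random() < a / (a + b):
            X.add(e)
        else:
            Y.discard(e)
    return X
```

For example, with a modular (hence submodular) objective given by element weights {1: 2.0, 2: -1.0, 3: 3.0}, one of the two clamped marginals is zero at every step, so the algorithm deterministically keeps elements 1 and 3 and drops element 2. The strict sequential dependence visible in the loop (each decision conditions on all prior decisions through X and Y) is exactly what the two parallelization strategies in the paper must relax or coordinate around.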

Author Information

Xinghao Pan (UC Berkeley)
Stefanie Jegelka (MIT)
Joseph Gonzalez (UC Berkeley)
Joseph K Bradley (University of California, Berkeley)
Michael Jordan (UC Berkeley)