Input Convex Gradient Networks
Jack Richter-Powell · Jonathan Lorraine · Brandon Amos
Event URL: https://arxiv.org/abs/2111.12187

The gradients of convex functions are expressive models of non-trivial vector fields. For example, the optimal transport map between any two measures on Euclidean space under the squared distance is realized as the gradient of a convex function via Brenier's theorem, a key insight used in recent machine learning flow models. In this paper, we study how to model convex gradients by integrating a Jacobian-vector product parameterized by a neural network, which we call the Input Convex Gradient Network (ICGN). We theoretically study ICGNs and compare them to modeling the gradient by differentiating an input-convex neural network, demonstrating that ICGNs can efficiently parameterize convex gradients.
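To make the core idea concrete: a vector field is the gradient of a convex function exactly when its Jacobian is symmetric positive semi-definite everywhere. The sketch below (a minimal illustration, not the authors' exact architecture) builds one simple map with this property, `f(x) = Wᵀ softplus(Wx + b)`, whose Jacobian is `Wᵀ diag(sigmoid(Wx + b)) W` and hence symmetric PSD since `sigmoid = softplus' ≥ 0`. It then checks both properties numerically with finite differences. The names `f` and `jacobian_fd` are illustrative, not from the paper.

```python
import numpy as np

def softplus(z):
    # Smooth, convex activation; its derivative is the sigmoid.
    return np.log1p(np.exp(z))

def f(x, W, b):
    # Candidate convex gradient field: f(x) = W^T softplus(Wx + b).
    # Its Jacobian is W^T diag(sigmoid(Wx + b)) W, which is symmetric
    # and PSD because sigmoid is nonnegative, so f = grad(F) for some
    # convex potential F. (Illustrative construction, not the ICGN itself.)
    return W.T @ softplus(W @ x + b)

def jacobian_fd(fn, x, eps=1e-6):
    # Central finite-difference Jacobian, used only for the check.
    n = x.size
    J = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = eps
        J[:, i] = (fn(x + e) - fn(x - e)) / (2 * eps)
    return J

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 3))
b = rng.standard_normal(8)
x = rng.standard_normal(3)

J = jacobian_fd(lambda v: f(v, W, b), x)
print("symmetric:", np.allclose(J, J.T, atol=1e-4))
print("PSD:", np.linalg.eigvalsh((J + J.T) / 2).min() >= -1e-6)
```

Symmetry of the Jacobian makes the field conservative (it integrates to a potential), and positive semi-definiteness makes that potential convex; the ICGN construction described in the abstract enforces the same structure while evaluating the field by integrating a JVP.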

Author Information

Jack Richter-Powell (McGill University)
Jonathan Lorraine (University Of Toronto)
Brandon Amos (Carnegie Mellon University)
