The Adam optimization algorithm has proven remarkably effective for optimization problems across machine learning and even traditional tasks in geometry processing. At the same time, the development of equivariant methods, whose output transforms consistently under rotation or some other transformation of their input, has proven to be important for geometry problems across these domains. In this work, we observe that Adam — when treated as a function that maps initial conditions to optimized results — is not rotation-equivariant for vector-valued parameters due to its per-coordinate moment updates. This leads to significant artifacts and biases in practice. We propose to resolve this deficiency with VectorAdam, a simple modification that makes Adam rotation-equivariant by accounting for the vector structure of the optimization variables. We demonstrate this approach on problems in machine learning and traditional geometric optimization, showing that equivariant VectorAdam resolves the artifacts and biases of traditional Adam when applied to vector-valued data, with equivalent or even improved rates of convergence.
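To make the failure mode concrete: standard Adam divides each coordinate of the gradient by a per-coordinate running second moment, and this elementwise division does not commute with rotation. The sketch below (a minimal NumPy illustration, not the authors' reference implementation; all function and variable names are hypothetical) instead accumulates one second-moment scalar per d-dimensional vector, using the squared norm of that vector's gradient, which is rotation-invariant, so the resulting update rotates together with the problem.

```python
import numpy as np

def vector_adam_step(x, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One step on vector-valued parameters x of shape (n, d).

    Standard Adam tracks a second moment per coordinate (g * g), which
    breaks rotation equivariance. Here the second moment is one scalar
    per d-dimensional vector (the squared norm of that vector's
    gradient), so the per-vector scaling is rotation-invariant and the
    update direction rotates with the gradient.
    """
    m = b1 * m + (1 - b1) * g                    # first moment: rotates with g
    g2 = np.sum(g * g, axis=1, keepdims=True)    # squared norm per vector: rotation-invariant
    v = b2 * v + (1 - b2) * g2                   # second moment: one scalar per vector, shape (n, 1)
    m_hat = m / (1 - b1 ** t)                    # standard Adam bias correction
    v_hat = v / (1 - b2 ** t)
    x = x - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-vector scaling preserves direction
    return x, m, v
```

A quick sanity check of equivariance: if every point and target is rotated by some rotation matrix Q before optimizing, the iterates produced by this update are exactly the rotated versions of the original iterates, which is not true of per-coordinate Adam.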
Selena Zihan Ling (University of Toronto)
PhD Student in geometry processing. Interested in artistic tools and applications in architecture.
Nicholas Sharp (Department of Computer Science, University of Toronto)
Alec Jacobson (University of Toronto)