Differentiation along algorithms, i.e., piggyback propagation of derivatives, is now routinely used to differentiate iterative solvers in differentiable programming. Its asymptotics are well understood for many smooth problems, but the nondifferentiable case has hardly been considered. Is there a limiting object for nonsmooth piggyback automatic differentiation (AD)? Does it have a variational meaning, and can it be used effectively in machine learning? Is there a connection with classical derivatives? We address all these questions under appropriate contractivity conditions in the framework of conservative derivatives, which has proved useful in understanding nonsmooth AD. We characterize the attractor set of nonsmooth piggyback iterations as a set-valued fixed point that remains within the conservative framework. This has various consequences, in particular the almost-everywhere convergence of classical derivatives. Our results are illustrated on parametric convex optimization problems with the forward-backward, Douglas-Rachford, and Alternating Direction Method of Multipliers algorithms, as well as the Heavy-Ball method.
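As a minimal illustration of piggyback propagation (a sketch, not the paper's implementation), consider the one-dimensional parametric problem min_x ½(x − θ)² + λ|x| solved by the forward-backward iteration x_{k+1} = soft(x_k − α(x_k − θ), αλ). Alongside each solver step, the derivative estimate J_k ≈ dx_k/dθ is propagated through the same update, using one selection from the conservative derivative of the soft-thresholding operator at its kinks. All function and variable names below are illustrative choices, not from the paper.

```python
import numpy as np

def soft(u, t):
    # Proximal operator of t*|.| (soft-thresholding).
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def soft_deriv(u, t):
    # One selection from the conservative derivative of soft(., t);
    # at the kinks |u| == t any value in [0, 1] is admissible, we pick 0.
    return 1.0 if abs(u) > t else 0.0

def piggyback_fb(theta, lam=0.3, alpha=0.8, iters=100):
    """Forward-backward iteration for min_x 0.5*(x - theta)**2 + lam*|x|,
    with piggyback propagation of J_k, an estimate of dx_k/dtheta."""
    x, J = 0.0, 0.0
    for _ in range(iters):
        u = x - alpha * (x - theta)      # gradient step on the smooth part
        du_dx = 1.0 - alpha              # partial derivative of u w.r.t. x
        du_dtheta = alpha                # partial derivative of u w.r.t. theta
        s = soft_deriv(u, alpha * lam)
        J = s * (du_dx * J + du_dtheta)  # piggyback derivative recursion
        x = soft(u, alpha * lam)         # proximal (backward) step
    return x, J

x, J = piggyback_fb(1.0)
# For theta = 1.0 > lam the solution is x* = theta - lam = 0.7, and dx*/dtheta = 1;
# both the iterate and the piggyback derivative converge to these values.
print(x, J)
```

Under the contractivity assumption in force here (the derivative recursion has factor s·(1 − α) < 1), the derivative iterates J_k converge to the fixed point of the piggyback map, matching the classical derivative wherever the solution path avoids the kink.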
Author Information
Jerome Bolte (Université Toulouse Capitole)
Edouard Pauwels (IRIT)
Samuel Vaiter (CNRS)
More from the Same Authors
- 2022 Poster: Benchopt: Reproducible, efficient and collaborative optimization benchmarks »
  Thomas Moreau · Mathurin Massias · Alexandre Gramfort · Pierre Ablin · Pierre-Antoine Bannier · Benjamin Charlier · Mathieu Dagréou · Tom Dupre la Tour · Ghislain DURIF · Cassio F. Dantas · Quentin Klopfenstein · Johan Larsson · En Lai · Tanguy Lefort · Benoît Malézieux · Badr MOUFAD · Binh T. Nguyen · Alain Rakotomamonjy · Zaccharie Ramzi · Joseph Salmon · Samuel Vaiter
- 2022 Poster: A framework for bilevel optimization that enables stochastic and global variance reduction algorithms »
  Mathieu Dagréou · Pierre Ablin · Samuel Vaiter · Thomas Moreau
- 2021 Poster: Semialgebraic Representation of Monotone Deep Equilibrium Models and Applications to Certification »
  Tong Chen · Jean Lasserre · Victor Magron · Edouard Pauwels
- 2021 Poster: Nonsmooth Implicit Differentiation for Machine-Learning and Optimization »
  Jérôme Bolte · Tam Le · Edouard Pauwels · Tony Silveti-Falls
- 2021 Poster: Numerical influence of ReLU’(0) on backpropagation »
  David Bertoin · Jérôme Bolte · Sébastien Gerchinovitz · Edouard Pauwels
- 2021 Poster: On the Universality of Graph Neural Networks on Large Random Graphs »
  Nicolas Keriven · Alberto Bietti · Samuel Vaiter
- 2020 Poster: Convergence and Stability of Graph Convolutional Networks on Large Random Graphs »
  Nicolas Keriven · Alberto Bietti · Samuel Vaiter
- 2020 Spotlight: Convergence and Stability of Graph Convolutional Networks on Large Random Graphs »
  Nicolas Keriven · Alberto Bietti · Samuel Vaiter
- 2016 Poster: Sorting out typicality with the inverse moment matrix SOS polynomial »
  Edouard Pauwels · Jean Lasserre