Is a sample rich enough to determine, at least locally, the parameters of a neural network? To answer this question, we introduce a new local parameterization of a given deep ReLU neural network by fixing the values of some of its weights. This allows us to define local lifting operators whose inverses are charts of a smooth manifold embedded in a high-dimensional space. The function implemented by the deep ReLU neural network composes the local lifting with a linear operator that depends on the sample. From this convenient representation we derive a geometric necessary and sufficient condition for local identifiability. Looking at tangent spaces, the geometric condition provides (1) a sharp and testable necessary condition of identifiability and (2) a sharp and testable sufficient condition of local identifiability. The validity of the conditions can be tested numerically using backpropagation and matrix rank computations.
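The rank test alluded to in the last sentence can be sketched on a toy example. The code below is an illustrative approximation, not the paper's exact criterion: for a small one-hidden-layer ReLU network, it forms the Jacobian of the outputs at the sample points with respect to the flattened parameter vector (finite differences stand in for backpropagation here) and computes its rank. The architecture sizes, sample count, and tolerance are all assumptions chosen for the sketch; a rank deficit is expected because of the per-neuron positive rescaling symmetry of ReLU networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy architecture and sample (all sizes are illustrative assumptions).
d_in, d_hid, d_out = 3, 4, 2
n_samples = 40
X = rng.standard_normal((n_samples, d_in))

# Flattened parameter vector: [W1, b1, W2, b2].
theta = np.concatenate([
    rng.standard_normal(d_hid * d_in),   # W1
    rng.standard_normal(d_hid),          # b1
    rng.standard_normal(d_out * d_hid),  # W2
    rng.standard_normal(d_out),          # b2
])

def forward(theta, X):
    """Outputs of a one-hidden-layer ReLU network at the sample points X."""
    i = 0
    W1 = theta[i:i + d_hid * d_in].reshape(d_hid, d_in); i += d_hid * d_in
    b1 = theta[i:i + d_hid]; i += d_hid
    W2 = theta[i:i + d_out * d_hid].reshape(d_out, d_hid); i += d_out * d_hid
    b2 = theta[i:i + d_out]
    h = np.maximum(X @ W1.T + b1, 0.0)   # ReLU activation
    return h @ W2.T + b2

# Jacobian of all sample outputs w.r.t. the parameters, by central
# finite differences (backpropagation would produce the same matrix).
eps = 1e-6
J = np.zeros((n_samples * d_out, theta.size))
for j in range(theta.size):
    tp, tm = theta.copy(), theta.copy()
    tp[j] += eps
    tm[j] -= eps
    J[:, j] = (forward(tp, X) - forward(tm, X)).ravel() / (2 * eps)

rank = np.linalg.matrix_rank(J, tol=1e-4)
print("Jacobian rank:", rank, "of", theta.size, "parameters")
```

A rank strictly below the parameter count signals tangent directions along which the sample outputs do not change, so the parameters cannot be locally identifiable along them; here the positive rescaling of each hidden neuron (scaling its incoming weights and bias by c > 0 and its outgoing weights by 1/c) contributes one such direction per hidden unit.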
Author Information
Joachim Bona-Pellissier (Université Paul Sabatier)
François Malgouyres (Université Toulouse Paul Sabatier, Institut de Mathématiques de Toulouse)
Francois Bachoc (Institut de Mathématiques de Toulouse)
More from the Same Authors
- 2022 Poster: High-dimensional Additive Gaussian Processes under Monotonicity Constraints
  Andrés López-Lopera · Francois Bachoc · Olivier Roustant
- 2022 Poster: A general approximation lower bound in $L^p$ norm, with applications to feed-forward neural networks
  El Mehdi Achour · Armand Foucault · Sébastien Gerchinovitz · François Malgouyres
- 2021 Poster: Instance-Dependent Bounds for Zeroth-order Lipschitz Optimization with Error Certificates
  Francois Bachoc · Tom Cesari · Sébastien Gerchinovitz