SAPE: Spatially-Adaptive Progressive Encoding for Neural Optimization

Amir Hertz · Or Perel · Raja Giryes · Olga Sorkine-Hornung · Daniel Cohen-Or


Keywords: [ Optimization ] [ Vision ] [ Machine Learning ] [ Deep Learning ] [ Representation Learning ]


Multilayer perceptrons (MLPs) are known to struggle to learn high-frequency functions, and in particular signals spanning wide frequency bands. We present a progressive mapping scheme for input signals of MLP networks, enabling them to better fit a wide range of frequencies without sacrificing training stability or requiring any domain-specific preprocessing. We introduce Spatially Adaptive Progressive Encoding (SAPE) layers, which gradually unmask signal components of increasing frequency as a function of both time and space. The progressive exposure of frequencies is monitored by a feedback loop throughout the neural optimization process, allowing changes to propagate at different rates among local spatial portions of the signal space. We demonstrate the advantage of our method on a variety of domains and applications: regression of low-dimensional signals and images, representation learning with occupancy networks, and the geometric task of mesh transfer between 3D shapes.
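The core mechanism described above, gradually unmasking Fourier-encoding frequencies per spatial location, can be sketched in a few lines of numpy. This is an illustrative toy, not the paper's implementation: the function name, the octave-based frequency bands, and the linear soft-mask schedule driven by a per-sample progress value `alpha` are all assumptions made for clarity.

```python
import numpy as np

def progressive_encoding(x, num_freqs, alpha):
    """Fourier-feature encoding with a progressive frequency mask (toy sketch).

    x         : (N, d) array of input coordinates.
    num_freqs : number of frequency octaves in the encoding.
    alpha     : (N,) per-sample progress in [0, 1]; letting alpha vary across
                samples is what makes the masking spatially adaptive.
                (This linear schedule is an illustrative choice.)

    Returns an (N, d * 2 * num_freqs) array of masked encoded features.
    """
    feats = []
    for j in range(num_freqs):
        # Soft mask: band j ramps up linearly and is fully exposed
        # once alpha * num_freqs >= j + 1.
        w = np.clip(alpha * num_freqs - j, 0.0, 1.0)[:, None]
        feats.append(w * np.sin((2.0 ** j) * np.pi * x))
        feats.append(w * np.cos((2.0 ** j) * np.pi * x))
    return np.concatenate(feats, axis=1)
```

A sample with `alpha = 0` receives an all-zero encoding (only low-frequency content can influence it early in training), while a sample with `alpha = 1` sees every frequency band; in the paper's setting, a feedback loop on the optimization would control how quickly each spatial region's progress value advances.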
