

Poster
in
Workshop: Attributing Model Behavior at Scale (ATTRIB)

In Search of a Data Transformation that Accelerates Neural Field Training

Junwon Seo · Sangyoon Lee · Jaeho Lee


Abstract:

A neural field is a special type of neural network that represents a single datum. We study whether the training of such networks can be sped up by fitting a transformed version of the target datum; the original signal is then recovered by inverting the transformation on the signal represented by the trained neural field. We empirically find that very simple data transformations, such as color inversion or random pixel shuffling, can substantially speed up or slow down training. In particular, and to our surprise, we observe that an image with randomly shuffled pixels can be fit much faster, despite its very high frequency content.
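
Below is a minimal sketch of the transform-then-fit recipe described in the abstract, assuming a random pixel shuffle as the transformation and a small coordinate MLP as the neural field; the placeholder image, network width, and training schedule are illustrative assumptions, not the authors' setup.

```python
# Sketch: fit a coordinate MLP to a randomly pixel-shuffled image,
# then undo the shuffle on the reconstruction to recover the signal.
import torch
import torch.nn as nn

H = W = 32
image = torch.rand(H * W, 3)          # placeholder target signal (flattened RGB)

perm = torch.randperm(H * W)          # the data transformation T (pixel shuffle)
inv_perm = torch.argsort(perm)        # its inverse T^{-1}
shuffled = image[perm]                # transformed datum T(x)

# 2-D pixel coordinates in [-1, 1], one row per pixel
ys, xs = torch.meshgrid(
    torch.linspace(-1, 1, H), torch.linspace(-1, 1, W), indexing="ij"
)
coords = torch.stack([xs.flatten(), ys.flatten()], dim=-1)

# small ReLU MLP as the neural field (coordinates -> RGB)
field = nn.Sequential(
    nn.Linear(2, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, 3),
)
opt = torch.optim.Adam(field.parameters(), lr=1e-3)

for step in range(1000):              # fit the *transformed* signal
    opt.zero_grad()
    loss = ((field(coords) - shuffled) ** 2).mean()
    loss.backward()
    opt.step()

# recover the original signal by inverting the transformation
recon = field(coords).detach()[inv_perm].reshape(H, W, 3)
```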
