

Poster in Workshop: INTERPOLATE — First Workshop on Interpolation Regularizers and Beyond

Sample Relationships through the Lens of Learning Dynamics with Label Information

Shangmin Guo · Yi Ren · Stefano Albrecht · Kenny Smith

Keywords: [ Learning Dynamics ] [ Sample Relationship ] [ Neural Tangent Kernel ] [ Label Information ] [ Neural Networks ]


Abstract:

Although much research has proposed new models or loss functions to improve the generalisation of artificial neural networks (ANNs), less attention has been paid to the data, which is an equally important factor in training ANNs. In this work, we start by approximating the interaction between two samples, i.e. how learning one sample modifies the model's prediction on another. By analysing the terms involved in weight updates in supervised learning, we find that the signs of the labels influence the interactions between samples. Therefore, we propose the labelled pseudo Neural Tangent Kernel (lpNTK), which takes label information into account when measuring the interactions between samples. We first prove that, under certain assumptions, lpNTK asymptotically converges to the well-known empirical Neural Tangent Kernel in terms of the Frobenius norm. Secondly, we illustrate how lpNTK helps to explain learning phenomena identified in previous work, specifically the learning difficulty of samples and forgetting events during learning. Finally, we show that lpNTK can help to improve the generalisation performance of ANNs in image classification tasks, compared with training on the original whole training sets.
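To make the abstract's first step concrete, below is a minimal sketch (PyTorch) of the first-order interaction it describes: one SGD step on a sample (x_u, y_u) changes the logits on another sample x_o by roughly -η K(x_o, x_u)(p_u - y_u), where K is the empirical NTK block built from per-class parameter gradients. The small MLP, sample shapes, and especially the final label-weighted similarity are illustrative assumptions, not the paper's exact lpNTK definition; the sketch only shows how label vectors can flip the sign of the gradient inner product.

```python
# Sketch of the first-order sample interaction via the empirical NTK,
# plus an illustrative label-signed similarity in the spirit of lpNTK.
# Model, data shapes, and the label weighting are assumptions for
# demonstration only; see the paper for the exact lpNTK definition.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 64),
                      nn.ReLU(), nn.Linear(64, 10))
params = [p for p in model.parameters() if p.requires_grad]

def per_class_grads(x):
    """Stack grad_theta f_k(x) over class logits k -> (num_classes, num_params)."""
    logits = model(x.unsqueeze(0)).squeeze(0)      # (C,)
    rows = []
    for k in range(logits.shape[0]):
        grads = torch.autograd.grad(logits[k], params, retain_graph=True)
        rows.append(torch.cat([g.reshape(-1) for g in grads]))
    return torch.stack(rows)                        # (C, P)

x_u, x_o = torch.randn(1, 28, 28), torch.randn(1, 28, 28)
y_u = F.one_hot(torch.tensor(3), num_classes=10).float()
y_o = F.one_hot(torch.tensor(7), num_classes=10).float()

J_u, J_o = per_class_grads(x_u), per_class_grads(x_o)

# Empirical NTK block: K[k, l] = <grad f_k(x_u), grad f_l(x_o)>.
K = J_u @ J_o.T                                     # (C, C)

# First-order learning dynamics: an SGD step of size eta on (x_u, y_u)
# with cross-entropy loss shifts the logits on x_o by about
# -eta * K(x_o, x_u) @ (p_u - y_u).
eta = 0.1
p_u = F.softmax(model(x_u.unsqueeze(0)).squeeze(0), dim=-1).detach()
delta_f_o = -eta * K.T @ (p_u - y_u)

# Illustrative label-signed similarity (an assumption, not the paper's
# formula): weighting each Jacobian by its one-hot label before the inner
# product makes agreeing vs. conflicting labels flip the sign.
lp_sim = (y_u @ J_u) @ (y_o @ J_o)
print(delta_f_o.shape, lp_sim.item())
```

The sign of `delta_f_o` for the true class of x_o is what distinguishes helpful from interfering sample pairs, which is the relationship the abstract says the label signs govern.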
