Real-Valued Backpropagation is Unsuitable for Complex-Valued Neural Networks
Zhi-Hao Tan · Yi Xie · Yuan Jiang · Zhi-Hua Zhou

Tue Nov 29 02:00 PM -- 04:00 PM (PST) @ Hall J #603

Recently, complex-valued neural networks have received increasing attention due to successful applications in various tasks, as well as their potential for better theoretical properties and richer representational capacity. However, how the training dynamics of complex networks compare to those of real networks remains an open problem. In this paper, we investigate the dynamics of deep complex networks under real-valued backpropagation in the infinite-width limit via the neural tangent kernel (NTK). We first extend the Tensor Program framework to the complex domain and show that the dynamics of any basic complex network architecture are governed by its NTK under real-valued backpropagation. We then propose to compare the training dynamics of complex and real networks by studying their NTKs. As a result, we prove the surprising fact that for most complex activation functions, the commonly used real-valued backpropagation reduces the training dynamics of complex networks to that of ordinary real networks as the widths tend to infinity, thus eliminating the distinctive characteristics of complex-valued neural networks. Finally, experiments numerically validate our theoretical findings.
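The reduction the abstract describes can be made concrete at the level of a single layer. Real-valued backpropagation, as commonly implemented, treats the real and imaginary parts of a complex weight matrix as independent real parameters of a real-valued loss. The NumPy sketch below (an illustration of this standard identification, not of the paper's NTK argument) shows that for a complex linear map y = (A + iB)(x + iu), the gradients with respect to A and B are exactly the block gradients of an equivalent real network with the doubled-width block matrix [[A, -B], [B, A]] acting on the stacked real vector [x; u]:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A, B = rng.standard_normal((n, n)), rng.standard_normal((n, n))
x, u = rng.standard_normal(n), rng.standard_normal(n)

# Complex forward pass: y = (A + iB)(x + iu); take a real-valued loss |y|^2.
W = A + 1j * B
z = x + 1j * u
y = W @ z
loss = np.sum(np.abs(y) ** 2)

# Real-valued backprop: differentiate the real loss w.r.t. A and B directly,
# using Re(y) = A x - B u and Im(y) = B x + A u.
re_y, im_y = y.real, y.imag
grad_A = 2 * (np.outer(re_y, x) + np.outer(im_y, u))
grad_B = 2 * (np.outer(im_y, x) - np.outer(re_y, u))

# Equivalent real network: stacked input [x; u], block weight [[A, -B], [B, A]].
M = np.block([[A, -B], [B, A]])
v = np.concatenate([x, u])
w = M @ v
loss_real = np.sum(w ** 2)
grad_M = 2 * np.outer(w, v)

# Collecting the blocks of grad_M where A and B appear recovers grad_A, grad_B.
gA = grad_M[:n, :n] + grad_M[n:, n:]   # A occurs in the two diagonal blocks
gB = grad_M[n:, :n] - grad_M[:n, n:]   # B occurs with opposite signs off-diagonal
assert np.allclose(loss, loss_real)
assert np.allclose(grad_A, gA)
assert np.allclose(grad_B, gB)
```

Under this identification, a width-n complex linear layer trains exactly like a structured width-2n real layer; the paper's result is the stronger statement that in the infinite-width limit, with most complex activations, even this structure washes out and the NTK coincides with that of an ordinary real network.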

Author Information

Zhi-Hao Tan (Nanjing University)
Yi Xie (Nanjing University)
Yuan Jiang (National Key Lab for Novel Software Technology)
Zhi-Hua Zhou (Nanjing University)