Workshop: Shared Visual Representations in Human and Machine Intelligence (SVRHM)
Generalized Predictive Coding: Bayesian Inference in Static and Dynamic models
André Ofner · Beren Millidge · Sebastian Stober
Predictive coding networks (PCNs) have an inherent degree of biological plausibility and perform approximate backpropagation of error in supervised settings. It is less clear how predictive coding compares to state-of-the-art architectures, such as variational autoencoders (VAEs), in unsupervised and probabilistic settings. We propose a generalized PCN that, like its inspiration in neuroscience, parameterizes hierarchical latent distributions under the Laplace approximation and maximizes model evidence via iterative inference using local precision-weighted error signals. Unlike its inspiration, it uses multi-layer networks with nonlinearities between latent distributions. We compare our model to VAE and VLAE baselines on three different image datasets and find that generalized predictive coding shows performance comparable to variational autoencoders with exact error backpropagation. Finally, we evaluate the possibility of learning temporal dynamics via static prediction by encoding sequences of states in generalized coordinates of motion.
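The core mechanism the abstract describes, iterative inference over latent means driven only by local precision-weighted prediction errors, can be sketched as follows. This is a minimal, hypothetical two-layer linear example for illustration only (the paper's model uses multi-layer nonlinear networks between latent distributions); all variable names, the unit-Gaussian prior on the latents, and the fixed generative weights are assumptions, not the authors' implementation.

```python
import numpy as np

def free_energy(x, mu, W, Pi):
    """Laplace-approximate free energy for a linear Gaussian layer:
    precision-weighted reconstruction error plus a unit-Gaussian prior on mu."""
    eps = x - W @ mu                      # prediction error at the data layer
    return 0.5 * eps @ Pi @ eps + 0.5 * mu @ mu

rng = np.random.default_rng(0)
d_obs, d_lat = 8, 4
W = 0.1 * rng.normal(size=(d_obs, d_lat))  # top-down generative weights (fixed here)
Pi = 2.0 * np.eye(d_obs)                   # precision (inverse variance) of the errors
x = rng.normal(size=d_obs)                 # a single observation
mu = np.zeros(d_lat)                       # latent mean under the Laplace approximation

f_before = free_energy(x, mu, W, Pi)
lr = 0.05
for _ in range(200):
    eps = x - W @ mu
    # Gradient descent on free energy: the update uses only the local,
    # precision-weighted error signal (plus the prior term -mu).
    mu += lr * (W.T @ (Pi @ eps) - mu)
f_after = free_energy(x, mu, W, Pi)
```

Each inference step moves the latent mean down the free-energy gradient, so `f_after` is lower than `f_before`; in the full model, the same local-error principle is applied at every level of the hierarchy, and the weights themselves are also learned.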