
Temporal FiLM: Capturing Long-Range Sequence Dependencies with Feature-Wise Modulations.
Sawyer Birnbaum · Volodymyr Kuleshov · Zayd Enam · Pang Wei Koh · Stefano Ermon

Thu Dec 12 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #151

Learning representations that accurately capture long-range dependencies in sequential inputs --- including text, audio, and genomic data --- is a key problem in deep learning. Feed-forward convolutional models capture feature interactions only within finite receptive fields, while recurrent architectures can be slow and difficult to train due to vanishing gradients. Here, we propose Temporal Feature-Wise Linear Modulation (TFiLM) --- a novel architectural component, inspired by adaptive batch normalization and its extensions, that uses a recurrent neural network to alter the activations of a convolutional model. This approach expands the receptive field of convolutional sequence models with minimal computational overhead. Empirically, we find that TFiLM significantly improves the learning speed and accuracy of feed-forward neural networks on a range of generative and discriminative learning tasks, including text classification and audio super-resolution.
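The abstract describes the mechanism at a high level: a recurrent network modulates the activations of a convolutional model block by block along the time axis. Below is a minimal PyTorch sketch of one such layer, assuming block-wise max pooling, an LSTM over the pooled blocks, and a purely multiplicative modulation; the block size, pooling choice, and the name TFiLMSketch are illustrative assumptions, not the authors' exact parameterization.

```python
# Hypothetical sketch of a TFiLM-style layer, based only on the
# abstract: an RNN produces per-block, per-channel scaling factors
# that modulate the activations of a 1D convolutional model.
import torch
import torch.nn as nn


class TFiLMSketch(nn.Module):
    def __init__(self, channels: int, block_size: int):
        super().__init__()
        self.block_size = block_size
        # LSTM summarizes the sequence of pooled blocks and emits one
        # modulation vector per block (assumption: hidden size = channels).
        self.rnn = nn.LSTM(input_size=channels, hidden_size=channels,
                           batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); time must divide evenly into blocks.
        b, c, t = x.shape
        nb = t // self.block_size
        blocks = x.view(b, c, nb, self.block_size)
        # Max-pool each block along time -> (batch, nb, channels).
        pooled = blocks.max(dim=-1).values.permute(0, 2, 1)
        # RNN over the block sequence: earlier blocks influence the
        # modulation of later ones, extending the effective receptive field.
        gamma, _ = self.rnn(pooled)                   # (batch, nb, channels)
        gamma = gamma.permute(0, 2, 1).unsqueeze(-1)  # (batch, c, nb, 1)
        return (blocks * gamma).reshape(b, c, t)


# Usage: modulate the activations of a Conv1d stage.
layer = TFiLMSketch(channels=64, block_size=32)
h = torch.randn(8, 64, 1024)   # e.g. activations from a Conv1d stack
out = layer(h)                 # same shape, (8, 64, 1024)
```

Interleaving such a layer with convolutional stages is one plausible reading of how the recurrent modulation adds long-range context at minimal computational overhead, since the RNN runs over pooled blocks rather than individual timesteps.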

Author Information

Sawyer Birnbaum (Stanford University)
Volodymyr Kuleshov (Stanford University)
Zayd Enam (Stanford University)
Pang Wei Koh (Stanford University)
Stefano Ermon (Stanford University)
