Workshop
Fri Dec 09 11:00 PM -- 09:30 AM (PST) @ Room 113
Neural Abstract Machines & Program Induction
Matko Bošnjak · Nando de Freitas · Tejas Kulkarni · Arvind Neelakantan · Scott E Reed · Sebastian Riedel · Tim Rocktäschel

Machine intelligence capable of learning complex procedural behavior, inducing (latent) programs, and reasoning with these programs is a key step toward solving artificial intelligence. The problems of learning procedural behavior and program induction have been studied from different perspectives in many computer science fields, such as program synthesis, probabilistic programming, inductive logic programming, reinforcement learning, and, more recently, deep learning. Despite the common goal, however, there has been little communication and collaboration between the different fields focused on this problem.

Recently, the deep learning community has produced many success stories involving neural networks that learn to use trainable memory abstractions. This has led to the development of neural networks with differentiable data structures, such as Neural Turing Machines, Memory Networks, Neural Stacks, and Hierarchical Attentive Memory, among others. Simultaneously, neural program induction models such as the Neural Programmer-Interpreter and the Neural Programmer have created a lot of excitement in the field, promising induction of algorithmic behavior and enabling the inclusion of programming-language constructs in the processes of execution and induction, while remaining end-to-end trainable. Trainable program induction models have the potential to make a substantial impact on many problems involving long-term memory, reasoning, and procedural execution, such as question answering, dialog, and robotics.
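
To make "differentiable data structure" concrete, here is a minimal sketch, in plain NumPy, of the content-based read operation at the heart of models such as Neural Turing Machines and Memory Networks: a soft attention over memory slots replaces a discrete lookup, so gradients flow through every slot. The names used here (`content_read`, `beta`) are illustrative assumptions, not code from any of the papers above.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - x.max())
    return e / e.sum()

def content_read(memory, key, beta=1.0):
    """Differentiable content-based read (NTM-style addressing).

    memory: (N, M) array of N memory slots, each an M-dim vector.
    key:    (M,) query vector emitted by a controller network.
    beta:   sharpness of the attention; larger -> closer to a hard lookup.

    Because the result is a convex combination of all slots, it is
    differentiable with respect to the memory, the key, and beta.
    """
    # Cosine similarity between the key and every memory slot.
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
    weights = softmax(beta * sims)      # soft address over slots
    return weights @ memory, weights    # weighted sum of slots

# Toy usage: a memory of four slots, queried with a noisy copy of slot 2.
memory = np.eye(4, 5)                   # 4 slots of dimension 5
key = memory[2] + 0.1 * np.random.randn(5)
read_vec, weights = content_read(memory, key, beta=5.0)
print(weights)                          # mass concentrates on slot 2
```

Write operations in these architectures follow the same pattern, applying erase/add updates weighted by the same soft addresses, which is what keeps the whole machine end-to-end trainable.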

The aim of the NAMPI workshop is to bring together researchers and practitioners from academia and industry in the areas of deep learning, program synthesis, probabilistic programming, inductive programming, and reinforcement learning, to exchange ideas on the future of program induction, with a special focus on neural network models and abstract machines. Through this workshop we aim to identify common challenges, exchange ideas and lessons learned across the different fields, and establish a set of standard evaluation benchmarks for approaches that learn with abstraction and/or reason with induced programs.

Areas of interest for discussion and submissions include, but are not limited to (in alphabetical order):
- Applications
- Compositionality in Representation Learning
- Differentiable Data Structures
- Differentiable Memory
- Function and (sub-)Program Compositionality
- Inductive Logic Programming
- Knowledge Representation in Neural Abstract Structures
- Large-scale Program Induction
- Meta-Learning and Self-Improvement
- Neural Abstract Machines
- Probabilistic Programming
- Program Induction: Datasets, Tasks, and Evaluation
- Program Synthesis
- Reinforcement Learning for Program Induction
- Semantic Parsing

Introduction
Stephen Muggleton - What use is Abstraction in Deep Program Induction? (Session)
Daniel Tarlow - In Search of Strong Generalization: Building Structured Models in the Age of Neural Networks (Session)
Charles Sutton - Learning Program Representation: Symbols to Semantics (Session)
Coffee Break (Break)
Doina Precup - From temporal abstraction to programs (Session)
Rob Fergus - Learning to Compose by Delegation (Session)
Percy Liang - How Can We Write Large Programs without Thinking? (Session)
Lunch (Break)
Martin Vechev - Program Synthesis and Machine Learning (Session)
Ed Grefenstette - Limitations of RNNs: a computational perspective (Session)
Coffee Break & Poster Session (Break & Poster session)
Jürgen Schmidhuber - Learning how to Learn Learning Algorithms: Recursive Self-Improvement (Session)
Joshua Tenenbaum & Kevin Ellis - Bayesian program learning: Prospects for building more human-like AI systems (Session)
Alex Graves - Learning When to Halt With Adaptive Computation Time (Session)
Debate with Percy Liang, Jürgen Schmidhuber, Joshua Tenenbaum, and Martin Vechev (Discussion Panel)
Closing word