Poster

Latent Attention For If-Then Program Synthesis

Chang Liu · Xinyun Chen · Richard Shin · Mingcheng Chen · Dawn Song

Area 5+6+7+8 #50

Keywords: [ Deep Learning or Neural Networks ] [ Multi-task and Transfer Learning ] [ (Application) Natural Language and Text Processing ] [ (Cognitive/Neuroscience) Language ]


Abstract:

Automatically translating natural language descriptions into programs is a long-standing and challenging problem. In this work, we consider a simple yet important sub-problem: translating textual descriptions into If-Then programs. We devise a novel neural network architecture for this task, which we train end-to-end. Specifically, we introduce Latent Attention, which computes multiplicative weights for the words in the description in a two-stage process, with the goal of better leveraging the natural language structures that indicate which parts of the description are relevant for predicting program elements. Our architecture reduces the error rate by 28.57% compared to prior art. We also propose a one-shot learning scenario for If-Then program synthesis and simulate it with our existing dataset. We demonstrate a variation on the training procedure for this scenario that outperforms the original procedure, significantly closing the gap to the model trained with all the data.
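
To make the two-stage idea concrete, here is a minimal numpy sketch of one plausible reading of the mechanism described above: a first ("latent") pass scores each token's usefulness as a structural cue (e.g. "if", "then"), a second pass lets each token propose an attention distribution over the whole description, and the final multiplicative weights are the latent-weighted mixture of those proposals. The embedding matrices E_latent, E_active, E_out, the query vector u, and the dot-product form of the second-stage proposals are assumptions of this sketch, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def latent_attention(E_latent, E_active, E_out, u):
    """Illustrative two-stage (latent) attention over n description tokens.

    E_latent: (n, d) embeddings scored by the latent pass (assumed)
    E_active: (n, d) embeddings used to form per-token proposals (assumed)
    E_out:    (n, d) embeddings pooled by the final weights (assumed)
    u:        (d,)   trainable query vector for the latent pass (assumed)
    """
    # Stage 1: latent weights over tokens; high weight marks
    # structural cue words that signal where relevant content sits.
    l = softmax(E_latent @ u)                      # (n,)
    # Stage 2: row i is the attention distribution token i proposes
    # over all tokens in the description.
    A = softmax(E_active @ E_active.T, axis=-1)    # (n, n)
    # Final multiplicative weights: latent-weighted mixture of the
    # per-token proposals (still a valid distribution over tokens).
    w = l @ A                                      # (n,)
    # Pooled representation fed to the program-element predictor.
    return w @ E_out                               # (d,)

# Usage with random embeddings: 6 tokens, embedding size 8.
rng = np.random.default_rng(0)
n, d = 6, 8
pooled = latent_attention(rng.normal(size=(n, d)),
                          rng.normal(size=(n, d)),
                          rng.normal(size=(n, d)),
                          rng.normal(size=d))
```

In a trained model, all four parameter groups would be learned end-to-end, and the pooled vector would feed a classifier over program elements (trigger/action channels and functions); those downstream details are omitted here.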
