Transfer Learning with Neural AutoML
Catherine Wong · Neil Houlsby · Yifeng Lu · Andrea Gesmundo

Thu Dec 06 07:45 AM -- 09:45 AM (PST) @ Room 517 AB #158

We reduce the computational cost of Neural AutoML with transfer learning. AutoML reduces human effort by automating the design of ML algorithms. Neural AutoML has become popular for the design of deep learning architectures; however, this method has a high computational cost. To address this, we propose Transfer Neural AutoML, which uses knowledge from prior tasks to speed up network design. We extend RL-based architecture search methods to support parallel training on multiple tasks, and then transfer the search strategy to new tasks. On language and image classification data, Transfer Neural AutoML reduces convergence time over single-task training by over an order of magnitude on many tasks.
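To make the idea concrete, below is a toy sketch (not the paper's implementation) of an RL-based architecture search controller: a softmax policy over a small discrete search space, trained with REINFORCE across several tasks, whose learned parameters can then warm-start search on a new task. The search space, reward function, and all hyperparameters here are invented for illustration.

```python
import math
import random

# Hypothetical toy search space: each architecture is one choice per decision.
SEARCH_SPACE = {
    "num_layers": [1, 2, 4],
    "hidden_size": [32, 64, 128],
}

class Controller:
    """Softmax policy over discrete architecture choices, trained with REINFORCE."""

    def __init__(self):
        # One logit per option per decision; shared across tasks during
        # multi-task training, then reused to warm-start a new task.
        self.logits = {k: [0.0] * len(v) for k, v in SEARCH_SPACE.items()}

    def _probs(self, key):
        exps = [math.exp(x) for x in self.logits[key]]
        z = sum(exps)
        return [e / z for e in exps]

    def sample(self, rng):
        """Sample one architecture; return it plus the chosen option indices."""
        arch, idxs = {}, {}
        for key, options in SEARCH_SPACE.items():
            i = rng.choices(range(len(options)), weights=self._probs(key))[0]
            arch[key], idxs[key] = options[i], i
        return arch, idxs

    def update(self, idxs, advantage, lr=0.1):
        # REINFORCE: raise the log-prob of sampled options by the advantage.
        for key, i in idxs.items():
            probs = self._probs(key)
            for j in range(len(probs)):
                grad = (1.0 if j == i else 0.0) - probs[j]
                self.logits[key][j] += lr * advantage * grad

def toy_reward(arch, task):
    # Stand-in for a child model's validation accuracy; each toy task
    # prefers a particular depth and width.
    return -abs(arch["num_layers"] - task["layers"]) \
           - abs(math.log2(arch["hidden_size"]) - task["log_hidden"])

def train(controller, tasks, steps, rng):
    """Train the controller on a pool of tasks (round-robin stands in for
    the parallel multi-task training described in the abstract)."""
    baseline = 0.0
    for _ in range(steps):
        task = rng.choice(tasks)
        arch, idxs = controller.sample(rng)
        r = toy_reward(arch, task)
        baseline = 0.9 * baseline + 0.1 * r  # moving-average baseline
        controller.update(idxs, r - baseline)
    return controller
```

In this sketch, transfer amounts to calling `train(pretrained_controller, [new_task], ...)` instead of starting from a fresh `Controller()`, so the search on the new task begins from a policy already biased toward previously useful architectures.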

Author Information

Catherine Wong (MIT)
Neil Houlsby (Google)
Yifeng Lu
Andrea Gesmundo (Google)