Module-Aware Optimization for Auxiliary Learning
Hong Chen · Xin Wang · Yue Liu · Yuwei Zhou · Chaoyu Guan · Wenwu Zhu


Auxiliary learning is a widely adopted practice in deep learning that aims to improve model performance on a primary task by exploiting the beneficial information in an auxiliary loss. Existing auxiliary learning methods focus only on balancing the auxiliary loss against the primary loss and ignore module-level auxiliary influence, i.e., the fact that an auxiliary loss can be beneficial for optimizing some modules within the model but harmful to others; as a result, they fail to make full use of the auxiliary information. To tackle this problem, we propose a Module-Aware Optimization approach for Auxiliary Learning (MAOAL). The proposed approach captures module-level influence through a learnable module-level auxiliary importance, i.e., the importance of each auxiliary loss to each module. Specifically, it jointly optimizes the module-level auxiliary importance and the model parameters in a bi-level manner: in the lower-level optimization, the model parameters are updated with the importance-parameterized gradient, while in the upper-level optimization, the module-level auxiliary importance is updated with the implicit gradient computed on a small developing dataset. Extensive experiments show that MAOAL consistently outperforms state-of-the-art baselines across different auxiliary losses and datasets, demonstrating that our method can serve as a powerful generic tool for auxiliary learning.
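The bi-level scheme described in the abstract can be illustrated with a minimal numpy sketch. Everything here is an illustrative assumption rather than the authors' implementation: a linear model split into two "modules", a per-module importance `lam[m]` scaling the auxiliary-loss gradient (lower level), and a one-step hypergradient on a small developing split standing in for the full implicit gradient (upper level).

```python
import numpy as np

# Toy setup: the auxiliary target agrees with the primary target on the
# features owned by module 0 and conflicts on those owned by module 1.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 4))
y_pri = X @ np.array([1.0, -1.0, 0.5, 0.0])   # primary-task targets
y_aux = X @ np.array([1.0, -1.0, -2.0, 3.0])  # auxiliary targets
X_dev, y_dev = X[:16], y_pri[:16]             # small "developing" split

w = np.zeros((2, 2))          # module m owns two input features
lam = np.ones(2)              # module-level auxiliary importance
lr_w, lr_lam = 0.05, 0.2
modules = (slice(0, 2), slice(2, 4))

def pred(w, X):
    return X[:, :2] @ w[0] + X[:, 2:] @ w[1]

dev_before = np.mean((pred(w, X_dev) - y_dev) ** 2)

for _ in range(300):
    # Lower level: descend the importance-parameterized gradient.
    err_pri = pred(w, X) - y_pri
    err_aux = pred(w, X) - y_aux
    for m, cols in enumerate(modules):
        g_pri = X[:, cols].T @ err_pri / len(X)
        g_aux = X[:, cols].T @ err_aux / len(X)
        w[m] -= lr_w * (g_pri + lam[m] * g_aux)
    # Upper level: one-step hypergradient of the developing loss w.r.t. lam
    # (chain rule through the lower-level update), clipped at zero.
    err_dev = pred(w, X_dev) - y_dev
    for m, cols in enumerate(modules):
        g_aux = X[:, cols].T @ (pred(w, X) - y_aux) / len(X)
        g_dev = X_dev[:, cols].T @ err_dev / len(X_dev)
        lam[m] = max(lam[m] + lr_lam * lr_w * (g_dev @ g_aux), 0.0)

dev_after = np.mean((pred(w, X_dev) - y_dev) ** 2)
# The importance of the conflicting module is driven toward zero, so the
# developing loss improves over uniform weighting of the auxiliary loss.
```

The key design point the sketch mirrors is that `lam` is indexed by module, not shared globally, so a single auxiliary loss can be amplified for modules it helps and suppressed for modules it hurts.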

Author Information

Hong Chen (Tsinghua University)
Xin Wang (Tsinghua University)
Yue Liu (Tsinghua University)
Yuwei Zhou (Tsinghua University)
Chaoyu Guan (Tsinghua University)
Wenwu Zhu (Tsinghua University)
