Federated Learning (FL) is a machine learning paradigm that allows decentralized clients to learn collaboratively without sharing their private data. However, excessive computation and communication demands pose challenges to current FL frameworks, especially when training large-scale models. To prevent these issues from hindering the deployment of FL systems, we propose a lightweight framework where clients jointly learn to fuse the representations generated by multiple fixed pre-trained models rather than training a large-scale model from scratch. This leads us to a more practical FL problem by considering how to capture more client-specific and class-relevant information from the pre-trained models and jointly improve each client's ability to exploit those off-the-shelf models. Here, we design a Federated Prototype-wise Contrastive Learning (FedPCL) approach which shares knowledge across clients through their class prototypes and builds client-specific representations in a prototype-wise contrastive manner. Sharing prototypes rather than learnable model parameters allows each client to fuse the representations in a personalized way while keeping the shared knowledge in a compact form for efficient communication. We perform a thorough evaluation of the proposed FedPCL in the lightweight framework, measuring and visualizing its ability to fuse various pre-trained models on popular FL datasets.
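The abstract outlines three ingredients: frozen pre-trained backbones whose representations are fused by a small trainable head, class prototypes as the only information exchanged between clients, and a prototype-wise contrastive objective. The sketch below is one way these pieces could fit together in PyTorch; it is an illustrative approximation rather than the authors' released implementation, and the linear fusion head, the temperature, and the cross-entropy form of the prototype-contrastive loss are all assumptions made for clarity.

```python
# Minimal sketch of the FedPCL ideas described in the abstract (illustrative only;
# module names, fusion architecture, and loss details are assumptions).
import torch
import torch.nn as nn
import torch.nn.functional as F


class FusionClient(nn.Module):
    """Client model: frozen pre-trained backbones plus a small trainable fusion head."""

    def __init__(self, backbones, feat_dims, proj_dim=256):
        super().__init__()
        self.backbones = nn.ModuleList(backbones)
        for b in self.backbones:           # pre-trained models stay fixed
            for p in b.parameters():
                p.requires_grad = False
        # one lightweight projection per backbone, then concatenate and fuse
        self.projs = nn.ModuleList([nn.Linear(d, proj_dim) for d in feat_dims])
        self.fuse = nn.Linear(proj_dim * len(backbones), proj_dim)

    def forward(self, x):
        # assumes each backbone maps x to a (batch, feat_dim) feature vector
        feats = [proj(b(x)) for b, proj in zip(self.backbones, self.projs)]
        z = self.fuse(torch.cat(feats, dim=1))
        return F.normalize(z, dim=1)       # unit-norm representation


def class_prototypes(z, y, num_classes):
    """Mean embedding per class; prototypes, not model weights, are what clients share."""
    protos = torch.zeros(num_classes, z.size(1), device=z.device)
    for c in range(num_classes):
        mask = (y == c)
        if mask.any():
            protos[c] = z[mask].mean(dim=0)
    return F.normalize(protos, dim=1)


def proto_contrastive_loss(z, y, protos, tau=0.07):
    """Pull each embedding toward its class prototype and away from the others."""
    logits = z @ protos.t() / tau          # (batch, num_classes) similarities
    return F.cross_entropy(logits, y)
```

In a full round under this sketch, each client would train only its projection and fusion layers with the loss above (against local and server-aggregated prototypes), then upload its class prototypes for averaging, which keeps the communicated payload compact compared with exchanging model parameters.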
Author Information
Yue Tan (University of Technology Sydney)
Guodong Long (University of Technology Sydney (UTS))
Jie Ma (University of Technology Sydney)
LU LIU (Google)
Lu Liu is a third-year Ph.D. student at the University of Technology Sydney. Her research interests lie in machine learning, meta-learning, and low-shot learning.
Tianyi Zhou (University of Maryland, College Park)

Tianyi Zhou (https://tianyizhou.github.io) is a tenure-track assistant professor of computer science at the University of Maryland, College Park. He received his Ph.D. from the School of Computer Science & Engineering at the University of Washington, Seattle. His research interests are in machine learning, optimization, and natural language processing (NLP). His recent work studies curriculum learning that combines high-level human learning strategies with model training dynamics to create a hybrid intelligence, with applications in semi/self-supervised learning, robust learning, reinforcement learning, meta-learning, ensemble learning, and more. He has published more than 80 papers and is a recipient of the Best Student Paper Award at ICDM 2013 and the 2020 IEEE Computer Society TCSC Most Influential Paper Award.
Jing Jiang (University of Technology Sydney)
Related Events (a corresponding poster, oral, or spotlight)
- 2022 Poster: Federated Learning from Pre-Trained Models: A Contrastive Learning Approach »
  Wed. Nov 30th, 05:00 -- 07:00 PM, Room Hall J #203
More from the Same Authors
- 2021 Spotlight: Constrained Robust Submodular Partitioning »
  Shengjie Wang · Tianyi Zhou · Chandrashekhar Lavania · Jeff A Bilmes
- 2022 Spotlight: Lightning Talks 3A-1 »
  Shu Ding · Wanxing Chang · Jiyang Guan · Mouxiang Chen · Guan Gui · Yue Tan · Shiyun Lin · Guodong Long · Yuze Han · Wei Wang · Zhen Zhao · Ye Shi · Jian Liang · Chenghao Liu · Lei Qi · Ran He · Jie Ma · Zemin Liu · Xiang Li · Hoang Tuan · Luping Zhou · Zhihua Zhang · Jianling Sun · Jingya Wang · LU LIU · Tianyi Zhou · Lei Wang · Jing Jiang · Yinghuan Shi
- 2022 Spotlight: Adversarial Auto-Augment with Label Preservation: A Representation Learning Principle Guided Approach »
  Kaiwen Yang · Yanchao Sun · Jiahao Su · Fengxiang He · Xinmei Tian · Furong Huang · Tianyi Zhou · Dacheng Tao
- 2022 Poster: Adversarial Auto-Augment with Label Preservation: A Representation Learning Principle Guided Approach »
  Kaiwen Yang · Yanchao Sun · Jiahao Su · Fengxiang He · Xinmei Tian · Furong Huang · Tianyi Zhou · Dacheng Tao
- 2022 Poster: Retrospective Adversarial Replay for Continual Learning »
  Lilly Kumari · Shengjie Wang · Tianyi Zhou · Jeff A Bilmes
- 2021 Poster: Constrained Robust Submodular Partitioning »
  Shengjie Wang · Tianyi Zhou · Chandrashekhar Lavania · Jeff A Bilmes
- 2021 Poster: Class-Disentanglement and Applications in Adversarial Detection and Defense »
  Kaiwen Yang · Tianyi Zhou · Yonggang Zhang · Xinmei Tian · Dacheng Tao
- 2021 Poster: CO-PILOT: COllaborative Planning and reInforcement Learning On sub-Task curriculum »
  Shuang Ao · Tianyi Zhou · Guodong Long · Qinghua Lu · Liming Zhu · Jing Jiang
- 2021 Poster: Recognizing Vector Graphics without Rasterization »
  XINYANG JIANG · LU LIU · Caihua Shan · Yifei Shen · Xuanyi Dong · Dongsheng Li
- 2020 Poster: Curriculum Learning by Dynamic Instance Hardness »
  Tianyi Zhou · Shengjie Wang · Jeffrey A Bilmes
- 2020 Poster: MESA: Boost Ensemble Imbalanced Learning with MEta-SAmpler »
  Zhining Liu · Pengfei Wei · Jing Jiang · Wei Cao · Jiang Bian · Yi Chang
- 2020 Poster: Cooperative Heterogeneous Deep Reinforcement Learning »
  Han Zheng · Pengfei Wei · Jing Jiang · Guodong Long · Qinghua Lu · Chengqi Zhang
- 2019 Poster: Curriculum-guided Hindsight Experience Replay »
  Meng Fang · Tianyi Zhou · Yali Du · Lei Han · Zhengyou Zhang
- 2019 Poster: Learning to Propagate for Graph Meta-Learning »
  LU LIU · Tianyi Zhou · Guodong Long · Jing Jiang · Chengqi Zhang
- 2018 Poster: Diverse Ensemble Evolution: Curriculum Data-Model Marriage »
  Tianyi Zhou · Shengjie Wang · Jeffrey A Bilmes
- 2014 Poster: Divide-and-Conquer Learning by Anchoring a Conical Hull »
  Tianyi Zhou · Jeffrey A Bilmes · Carlos Guestrin