Novel research in personalized federated learning discovers effective parameters for fine-tuning and federated averaging at training time, surpassing existing FL baselines
The Federated Learning and Analytics in Practice: Algorithms, Systems, Applications, and Opportunities Workshop at ICML 2023 was recently held in Honolulu, HI, on July 23–29, 2023.
The team presented recent advancements in personalized federated learning (FL), which seeks to increase client-level performance by fine-tuning client parameters on local data or personalizing architectures for the local task. Existing methods for such personalization either prune a global model or fine-tune a global model on a local client distribution. As a result, they either personalize at the expense of retaining important global knowledge, or predetermine the network layers to fine-tune, leading to sub-optimal storage of global knowledge within client models. Inspired by the lottery ticket hypothesis, the team first introduced a hypothesis for finding optimal client sub-networks to fine-tune locally while leaving the remaining parameters frozen. They then proposed a novel FL framework built on this procedure, FedSelect, which directly personalizes both client sub-network structure and parameters by simultaneously discovering, during training, the optimal parameters for personalization and the remaining parameters for global aggregation. The team showed that this method achieves promising results on CIFAR-10.
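The core idea above, splitting each client's parameters into a personalized subset that stays local and a shared subset that is federated-averaged, can be illustrated with a minimal sketch. Note this is not the paper's implementation: the drift-magnitude selection rule, the `frac` parameter, and both function names are illustrative assumptions, and real FedSelect operates on full neural-network weights rather than flat arrays.

```python
import numpy as np

def select_personal_mask(global_params, local_params, frac=0.25):
    """Hypothetical selection rule (not the paper's actual criterion):
    mark the fraction of parameters that drifted most during local
    fine-tuning as 'personal'; the rest stay shared."""
    drift = np.abs(local_params - global_params)
    k = max(1, int(frac * drift.size))
    threshold = np.partition(drift, -k)[-k]  # k-th largest drift value
    return drift >= threshold  # True = personalized, False = shared

def aggregate_shared(global_params, client_params, client_masks):
    """FedAvg restricted to shared coordinates: each client contributes a
    parameter to the average only if it did not mark it as personal."""
    total = np.zeros_like(global_params)
    counts = np.zeros_like(global_params)
    for params, mask in zip(client_params, client_masks):
        total += np.where(mask, 0.0, params)
        counts += ~mask
    # Positions that every client personalized keep the old global value.
    return np.where(counts > 0, total / np.maximum(counts, 1), global_params)
```

In a full FL loop, each round would broadcast the aggregated shared parameters, let clients overwrite only their non-personal coordinates, fine-tune locally, and re-select masks, so the personalized sub-network is discovered jointly with training rather than fixed in advance.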
In recognition of his research, Rishub Tamirisa was awarded the FL-ICML Early Career Scholarship.
Also of note:
- The team received the Cohere For AI Research Grant
- They were the only undergraduates presenting at the workshop
The group would sincerely like to thank NCSA, Amazon, Microsoft, POSTECH, and ICML for covering their travel, board, and event registration costs.
For more information on their research, read the full paper HERE.
To read about the students’ impressions of the event and see their photographs, click HERE for a recap.