TY - GEN
T1 - Empowering Federated Learning
T2 - 31st Irish Conference on Artificial Intelligence and Cognitive Science, AICS 2023
AU - Khan, Mashal
AU - Glavin, Frank G.
AU - Nickles, Matthias
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
AB - Federated Learning is an emerging approach to Machine Learning that enables decentralised model training while safeguarding privacy. Its potential applications, particularly in Medicine, Smart Manufacturing, Finance, and the Internet of Things, hold significant promise. However, it faces hurdles due to resource constraints and the diverse nature of data and devices at the client end. This paper highlights the critical challenge of client drift and its effects on Machine Learning model performance across various architectural configurations. Furthermore, our findings reveal that pretrained models such as ResNet offer a compelling way to mitigate the impact of client drift to some extent; however, leveraging pretrained models requires substantial client-side resources. In response to the dual challenges of client drift and resource constraints, we propose an approach based on Knowledge Distillation at the client that combines distillation loss and classification loss. Here, the teacher model is trained on a more compact dataset, while the student model is trained on a larger, more diverse dataset. This approach not only improves robustness but also enhances privacy. The outcomes of our experiments substantiate the efficacy of this technique, showing an approximate 50% improvement in the accuracy and loss of the student model.
KW - Client Drift
KW - Federated Learning
KW - Knowledge Distillation
KW - Machine Learning
UR - http://www.scopus.com/inward/record.url?scp=85189943332&partnerID=8YFLogxK
U2 - 10.1109/AICS60730.2023.10470505
DO - 10.1109/AICS60730.2023.10470505
M3 - Conference Publication
AN - SCOPUS:85189943332
T3 - 2023 31st Irish Conference on Artificial Intelligence and Cognitive Science, AICS 2023
BT - 2023 31st Irish Conference on Artificial Intelligence and Cognitive Science, AICS 2023
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 7 December 2023 through 8 December 2023
ER -
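
The abstract above describes a client-side objective that combines a distillation loss with a classification loss, with the teacher trained on a compact dataset and the student on a larger, more diverse one. The record contains no code, so the snippet below is only a minimal, illustrative PyTorch sketch of such a combined objective; the function name kd_loss, the temperature, and the weighting factor alpha are assumptions of this sketch, not details taken from the paper.

```python
# Illustrative sketch only: a standard Hinton-style KD objective, not the
# authors' published implementation. Assumes PyTorch is installed.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Weighted sum of distillation loss and classification loss.

    distillation: KL divergence between temperature-softened teacher and
                  student output distributions.
    classification: ordinary cross-entropy of the student against hard labels.
    """
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    distill = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    classif = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * classif

if __name__ == "__main__":
    # Dummy batch to show the call shape: 8 samples, 10 classes.
    student_logits = torch.randn(8, 10, requires_grad=True)
    teacher_logits = torch.randn(8, 10)  # would come from the frozen teacher model
    labels = torch.randint(0, 10, (8,))
    loss = kd_loss(student_logits, teacher_logits, labels)
    loss.backward()
    print(float(loss))
```

In a federated setting, a loss of this form would be minimised during each client's local training round before updates are sent to the server; how the paper weights the two terms is not stated in this record.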