Secure and Adaptive Federated Learning with Knowledge Distillation and Hierarchical Homomorphic Encryption for Non-IID Data
Polytechnic School, Universidade de São Paulo (USP), São Paulo, 05508-010, Brazil
Academic Editor: Eugenio Vocaturo

Abstract:

Federated Learning (FL) offers a promising paradigm for privacy-preserving, collaborative machine learning; however, the presence of non-independent and identically distributed (non-IID) data among clients significantly degrades global model performance. This research proposes a novel architecture that combines Knowledge Distillation (KD) with Vision Transformer (ViT) models and hierarchical fully homomorphic encryption (FHE) to address both the non-IID data challenge and privacy preservation in FL. The proposed framework employs an aggregator server to homomorphically aggregate encrypted local model parameters, which are then decrypted and averaged by a separate federated server, ensuring that only clients retain access to their unencrypted parameters. Traditional approaches that modify aggregation algorithms are computationally prohibitive or incompatible with FHE; in contrast, KD facilitates robust model adaptation to local client distributions, supports heterogeneous client architectures, and integrates seamlessly with encrypted workflows. Experimental results with two clients, one using the CIFAR-10 dataset and the other using the Pascal VOC 2007 dataset (sharing common classes), demonstrate the efficacy of the approach. EfficientNet was used for local training, with Pyfhel-based FHE applied to the model parameter exchange. Without knowledge distillation, the system obtained an AUC of 0.78, which improved to 0.84 when ViT-based knowledge distillation was applied. The findings highlight the proposed method's potential to enhance FL robustness, adaptability, and privacy, representing a viable and scalable solution for privacy-preserving collaborative learning in heterogeneous environments.
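The encrypted-aggregation flow described above (clients encrypt parameters, an aggregator combines ciphertexts without decrypting, and a separate federated server decrypts and averages) can be sketched in miniature. The paper itself uses Pyfhel-based FHE; the sketch below substitutes a textbook Paillier cryptosystem with small, insecure toy primes purely to illustrate the additively homomorphic aggregation step. All parameter values, weights, and function names here are illustrative assumptions, not the authors' implementation.

```python
# Toy sketch of homomorphically aggregated federated averaging.
# Assumptions (not from the paper): textbook Paillier instead of CKKS/Pyfhel,
# tiny hardcoded primes, fixed randomness, fixed-point weight quantization.
import math

p, q = 10007, 10009              # toy primes -- NOT secure parameters
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):                        # Paillier's L function: L(x) = (x - 1) / n
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # modular inverse, Python 3.8+

def encrypt(m, r=12345):         # fixed r for determinism; random in practice
    return (pow(g, m % n, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    m = (L(pow(c, lam, n2)) * mu) % n
    return m - n if m > n // 2 else m   # recenter to allow negative weights

SCALE = 1000                     # fixed-point scale for float model weights

def enc_weights(ws):             # client side: quantize, then encrypt
    return [encrypt(round(w * SCALE)) for w in ws]

def aggregate(cts_per_client):   # aggregator: ciphertext product = plaintext sum
    out = []
    for cts in zip(*cts_per_client):
        acc = 1
        for c in cts:
            acc = (acc * c) % n2
        out.append(acc)
    return out

def dec_average(cts, k):         # federated server: decrypt sum, then average
    return [decrypt(c) / (SCALE * k) for c in cts]

# Two clients with hypothetical local parameter vectors:
client_a = [0.2, -0.4, 0.6]
client_b = [0.4, 0.0, -0.2]
enc_sum = aggregate([enc_weights(client_a), enc_weights(client_b)])
global_avg = dec_average(enc_sum, 2)   # → [0.3, -0.2, 0.2]
```

Note the role separation the abstract relies on: the aggregator only multiplies ciphertexts and never holds the decryption key, while the federated server sees only the decrypted sum, never any individual client's parameters.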

Keywords: federated learning; privacy; secure; knowledge distillation; vision transformers; fully homomorphic encryption; non-IID