ZHANG Mingquan, JIA Yuanyuan, ZHANG Ronghua. Hybrid knowledge distillation-assisted heterogeneous federated class incremental learning for digital twins[J]. CAAI Transactions on Intelligent Systems, 2025, 20(4): 905-915. [doi: 10.11992/tis.202406027]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785 / CN 23-1538/TP]
Volume: 20
Issue: No. 4, 2025
Pages: 905-915
Section: Academic Papers - Machine Learning
Publication Date: 2025-08-05
- Title: Hybrid knowledge distillation-assisted heterogeneous federated class incremental learning for digital twins
- Author(s): ZHANG Mingquan1,2, JIA Yuanyuan1, ZHANG Ronghua1,3
  1. Department of Computer Science, North China Electric Power University, Baoding 071003, China;
  2. Hebei Key Laboratory of Knowledge Computing for Energy & Power, North China Electric Power University, Baoding 071003, China;
  3. Engineering Research Center of Intelligent Computing for Complex Energy Systems, Ministry of Education, North China Electric Power University, Baoding 071003, China
- Keywords: digital twin; federated class incremental learning; hybrid knowledge distillation; data heterogeneity; image classification; catastrophic forgetting; CT images; federated learning
- CLC number: TP399
- DOI: 10.11992/tis.202406027
- Document code: 2025-2-24
- Abstract: In the context of digital twins, federated learning faces the challenges of non-independent and identically distributed (non-IID) data and dynamically changing classes, that is, data heterogeneity in both the spatial and temporal dimensions. To address this problem, this paper constructs an overall framework for federated class incremental learning in digital twins and proposes a hybrid knowledge distillation-assisted heterogeneous federated class incremental learning method (FedKA). Specifically, unlike the local update scheme of traditional federated learning, FedKA applies a hybrid knowledge distillation method during local updates that integrates an adaptive semantic distillation loss with an adaptive attention distillation loss, extracting the soft-label semantic knowledge of the output layer and the high-dimensional feature knowledge of the intermediate layers of the old global model. The client model can thus effectively reduce forgetting of old data while fitting new data, improving the performance of the federated class incremental model. Under the same data heterogeneity, FedKA improves accuracy on the CIFAR100 dataset by 1.85% to 2.56% over the compared models, and it achieves the best or second-best performance on the medical CT image datasets OrganAMNIST, OrganCMNIST, and OrganSMNIST.
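The abstract describes a local update that pairs a semantic distillation loss on the old global model's output-layer soft labels with an attention distillation loss on its intermediate-layer features. Below is a minimal PyTorch-style sketch of such a hybrid loss; the temperature T, the fixed weights alpha and beta, and the squared-activation attention maps are illustrative assumptions, not the paper's exact adaptive formulation.

```python
# Sketch of a hybrid knowledge distillation loss for federated
# class-incremental local updates. Hyperparameters and the attention-map
# definition are assumptions for illustration only.
import torch
import torch.nn.functional as F

def attention_map(feat: torch.Tensor) -> torch.Tensor:
    # (B, C, H, W) -> (B, H*W): channel-wise sum of squared activations,
    # L2-normalized per sample (attention-transfer style).
    a = feat.pow(2).sum(dim=1).flatten(1)
    return F.normalize(a, p=2.0, dim=1)

def hybrid_kd_loss(new_logits, old_logits, new_feats, old_feats,
                   T: float = 2.0, alpha: float = 1.0, beta: float = 1.0):
    # Semantic distillation: KL divergence between temperature-softened
    # class distributions of the client model and the frozen old global model.
    sem = F.kl_div(F.log_softmax(new_logits / T, dim=1),
                   F.softmax(old_logits.detach() / T, dim=1),
                   reduction="batchmean") * (T * T)
    # Attention distillation: match spatial attention maps of paired
    # intermediate layers of the two models.
    att = sum(F.mse_loss(attention_map(fn), attention_map(fo.detach()))
              for fn, fo in zip(new_feats, old_feats))
    return alpha * sem + beta * att
```

In a local update round, this distillation term would typically be added to the cross-entropy loss on the client's new-class data before the updated model is sent back to the server for aggregation.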