[1] JIANG Yunliang, ZHOU Yang, ZHANG Xiongtao, et al. Cross-subject motor imagery EEG classification based on inter-domain Mixup fine-tuning strategy[J]. CAAI Transactions on Intelligent Systems, 2024, 19(4): 909-919. [doi:10.11992/tis.202208017]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785/CN 23-1538/TP] Volume:
19
Issue:
2024, No. 4
Pages:
909-919
Column:
Academic Papers: Machine Perception and Pattern Recognition
Publication date:
2024-07-05
- Title:
-
Cross-subject motor imagery EEG classification based on inter-domain Mixup fine-tuning strategy
- Author(s):
-
JIANG Yunliang1,2,3, ZHOU Yang1,2, ZHANG Xiongtao1,2, MIAO Minmin1,2, ZHANG Yong1,2
-
1. School of Information Engineering, Huzhou University, Huzhou 313000, China;
2. Zhejiang Province Key Laboratory of Smart Management & Application of Modern Agricultural Resources, Huzhou University, Huzhou 313000, China;
3. School of Computer Science and Technology, Zhejiang Normal University, Jinhua 321000, China
-
- Keywords:
-
inter-domain Mixup; pre-training; fine-tuning; electroencephalogram; motor imagery; cross-subject knowledge transfer; convolutional neural network; regularization
- CLC number:
-
TP18
- DOI:
-
10.11992/tis.202208017
- Abstract:
-
To alleviate the catastrophic forgetting problem of vanilla fine-tuning, we propose Mix-Tuning, a cross-subject motor imagery EEG classification algorithm based on an inter-domain Mixup fine-tuning strategy. Mix-Tuning achieves cross-domain knowledge transfer through a two-stage training scheme consisting of pre-training and fine-tuning. In the pre-training stage, Mix-Tuning initializes the model parameters with source-domain data, mining the latent information of the source domain. In the fine-tuning stage, Mix-Tuning generates inter-domain interpolated data via inter-domain Mixup and uses them to fine-tune the model parameters. The inter-domain Mixup data augmentation strategy injects latent source-domain information, which alleviates the catastrophic forgetting of vanilla fine-tuning in sparse-sample scenarios and improves the generalization performance of the model. Mix-Tuning is further applied to the motor imagery EEG classification task and achieves positive cross-subject knowledge transfer. It reaches an average classification accuracy of 85.50% on the motor imagery task of the BMI dataset, improving on the subject-dependent (58.72%) and subject-independent (84.01%) training schemes by 26.78 and 1.49 percentage points, respectively. The analysis results in this paper can serve as a reference for cross-subject motor imagery EEG classification algorithms.
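The inter-domain Mixup step described in the abstract can be sketched as follows. This is a minimal illustration of standard Mixup interpolation applied across a source-domain and a target-domain batch, not the authors' implementation; the function name `inter_domain_mixup`, the array shapes, and the Beta parameter `alpha` are assumptions for the example.

```python
import numpy as np

def inter_domain_mixup(x_src, y_src, x_tgt, y_tgt, alpha=0.2, rng=None):
    """Interpolate a source-domain batch with a target-domain batch (Mixup).

    x_*: (batch, ...) feature arrays; y_*: (batch, n_classes) one-hot labels.
    Returns mixed inputs and soft labels for the fine-tuning stage.
    """
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)               # mixing coefficient lambda in [0, 1]
    x_mix = lam * x_src + (1.0 - lam) * x_tgt  # inter-domain interpolated inputs
    y_mix = lam * y_src + (1.0 - lam) * y_tgt  # correspondingly mixed soft labels
    return x_mix, y_mix

# Toy example: 4 "EEG trials" of shape (channels=2, samples=3), 2 classes.
rng = np.random.default_rng(0)
x_src = rng.normal(size=(4, 2, 3))
x_tgt = rng.normal(size=(4, 2, 3))
y_src = np.eye(2)[[0, 1, 0, 1]]
y_tgt = np.eye(2)[[1, 1, 0, 0]]
x_mix, y_mix = inter_domain_mixup(x_src, y_src, x_tgt, y_tgt, rng=rng)
assert x_mix.shape == x_src.shape
assert np.allclose(y_mix.sum(axis=1), 1.0)  # soft labels remain a distribution
```

Because the interpolated batch always retains a fraction of source-domain structure, gradient updates during fine-tuning cannot drift arbitrarily far from the pre-trained solution, which is the regularization effect the abstract attributes to the strategy.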
Memo
Received: 2022-08-17.
Foundation items: National Natural Science Foundation of China (61771193, 62101189, 62376094, U22A20102); Scientific Research Project of the Zhejiang Provincial Department of Education (Y202146028).
About the authors: JIANG Yunliang, professor, doctoral supervisor, Ph.D. His main research interests are intelligent information processing and geographic information systems. He is a recipient of the special government allowance of the State Council, holds 26 authorized invention patents, and has published 63 academic papers and 2 monographs. E-mail: jyl2022@zjnu.cn. ZHOU Yang, master's student. His main research interests are deep learning, transfer learning, and EEG signal processing. E-mail: 3189269614@qq.com. ZHANG Xiongtao, associate professor, Ph.D. His main research interests are artificial intelligence, pattern recognition, and machine learning. He has undertaken more than 10 national and provincial/ministerial research projects, holds 7 authorized invention patents, and has published more than 20 academic papers. E-mail: 02032@zjhu.edu.cn
Corresponding author: ZHANG Xiongtao. E-mail: 02032@zjhu.edu.cn