[1]JIANG Yunliang,ZHOU Yang,ZHANG Xiongtao,et al.Cross-subject motor imagery EEG classification based on inter-domain Mixup fine-tuning strategy[J].CAAI Transactions on Intelligent Systems,2024,19(4):909-919.[doi:10.11992/tis.202208017]

Cross-subject motor imagery EEG classification based on inter-domain Mixup fine-tuning strategy

References:
[1] RAMOSER H, MULLER-GERKING J, PFURTSCHELLER G. Optimal spatial filtering of single trial EEG during imagined hand movement[J]. IEEE transactions on rehabilitation engineering, 2000, 8(4): 441–446.
[2] ANG Kaikeng, CHIN Z Y, WANG Chuanchu, et al. Filter bank common spatial pattern algorithm on BCI competition IV datasets 2a and 2b[J]. Frontiers in neuroscience, 2012, 6: 39.
[3] GAO Xiaorong, WANG Yijun, CHEN Xiaogang, et al. Interface, interaction, and intelligence in generalized brain-computer interfaces[J]. Trends in cognitive sciences, 2021, 25(8): 671–684.
[4] REZEIKA A, BENDA M, STAWICKI P, et al. Brain-computer interface spellers: a review[J]. Brain sciences, 2018, 8(4): 57.
[5] 李景聪, 潘伟健, 林镇远, 等. 采用多路图注意力网络的情绪脑电信号识别方法[J]. 智能系统学报, 2022, 17(3): 531–539.
LI Jingcong, PAN Weijian, LIN Zhenyuan, et al. Emotional EEG signal recognition method using multi-path graph attention network[J]. CAAI transactions on intelligent systems, 2022, 17(3): 531–539.
[6] CRAIK A, HE Yongtian, CONTRERAS-VIDAL J L. Deep learning for electroencephalogram (EEG) classification tasks: a review[J]. Journal of neural engineering, 2019, 16(3): 031001.
[7] ZHAO Liming, YAN Xu, LU Baoliang. Plug-and-play domain adaptation for cross-subject EEG-based emotion recognition[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Vancouver: AAAI, 2021, 35(1): 863–870.
[8] NI Ziyi, XU Jiaming, WU Yuwei, et al. Improving cross-state and cross-subject visual ERP-based BCI with temporal modeling and adversarial training[J]. IEEE transactions on neural systems and rehabilitation engineering, 2022, 30: 369–379.
[9] YOSINSKI J, CLUNE J, BENGIO Y, et al. How transferable are features in deep neural networks?[C]//Proceedings of the Advances in Neural Information Processing Systems. Montréal: MIT Press, 2014, 27: 3320–3328.
[10] WANG Yaqing, YAO Quanming, KWOK J T, et al. Generalizing from a few examples: a survey on few-shot learning[J]. ACM computing surveys, 2020, 53(3): 1–34.
[11] LEE M H, KWON O Y, KIM Y J, et al. EEG dataset and OpenBMI toolbox for three BCI paradigms: an investigation into BCI illiteracy[J]. GigaScience, 2019, 8(5): giz002.
[12] VAPNIK V N. An overview of statistical learning theory[J]. IEEE transactions on neural networks, 1999, 10(5): 988–999.
[13] PERKONIGG M, HOFMANNINGER J, HEROLD C J, et al. Dynamic memory to alleviate catastrophic forgetting in continual learning with medical imaging[J]. Nature communications, 2021, 12: 5678.
[14] ZHANG Baosheng, GUO Yuchen, LI Yipeng, et al. Memory recall: a simple neural network training framework against catastrophic forgetting[J]. IEEE transactions on neural networks and learning systems, 2022, 33(5): 2010–2022.
[15] HOULSBY N, GIURGIU A, JASTRZEBSKI S, et al. Parameter-efficient transfer learning for NLP[C]//Proceedings of the 36th International Conference on Machine Learning. Long Beach: IMLS, 2019, 97: 2790–2799.
[16] LI Xuhong, GRANDVALET Y, DAVOINE F. Explicit inductive bias for transfer learning with convolutional networks[C]//Proceedings of the 35th International Conference on Machine Learning. Stockholmsmässan: IMLS, 2018, 80: 2825–2834.
[17] CHEN Sanyuan, HOU Yutai, CUI Yiming, et al. Recall and learn: fine-tuning deep pretrained language models with less forgetting[C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2020: 7870–7881.
[18] LEE C, CHO K, KANG Wanmo. Mixout: effective regularization to finetune large-scale pretrained language models[C]//Proceedings of the 8th International Conference on Learning Representations. Addis Ababa: ICLR, 2020: 1–17.
[19] XU Runxin, LUO Fuli, ZHANG Zhiyuan, et al. Raise a child in large language model: towards effective and generalizable fine-tuning[C]//Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2021: 9514–9528.
[20] SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout: a simple way to prevent neural networks from overfitting[J]. Journal of machine learning research, 2014, 15: 1929–1958.
[21] CHAPELLE O, WESTON J, BOTTOU L, et al. Vicinal risk minimization[C]//Proceedings of the Advances in Neural Information Processing Systems. Denver: MIT Press, 2000, 13: 1–7.
[22] ZHANG Hongyi, CISSE M, DAUPHIN Y N, et al. Mixup: beyond empirical risk minimization[C]//Proceedings of the International Conference on Learning Representations. Vancouver: ICLR, 2018: 1–13.
[23] LAWHERN V J, SOLON A J, WAYTOWICH N R, et al. EEGNet: a compact convolutional neural network for EEG-based brain-computer interfaces[J]. Journal of neural engineering, 2018, 15(5): 056013.
[24] LI Yang, GUO Lianghui, LIU Yu, et al. A temporal-spectral-based squeeze-and-excitation feature fusion network for motor imagery EEG decoding[J]. IEEE transactions on neural systems and rehabilitation engineering, 2021, 29: 1534–1545.
[25] CAI Siqi, SU Enze, XIE Longhan, et al. EEG-based auditory attention detection via frequency and channel neural attention[J]. IEEE transactions on human-machine systems, 2021, 52(2): 256–266.
[26] SCHIRRMEISTER R T, SPRINGENBERG J T, FIEDERER L D J, et al. Deep learning with convolutional neural networks for EEG decoding and visualization[J]. Human brain mapping, 2017, 38(11): 5391–5420.
[27] KOSTAS D, RUDZICZ F. Thinker invariance: enabling deep neural networks for BCI across more people[J]. Journal of neural engineering, 2020, 17(5): 056008.
[28] KWON O Y, LEE M H, GUAN Cuntai, et al. Subject-independent brain-computer interfaces based on deep convolutional neural networks[J]. IEEE transactions on neural networks and learning systems, 2019, 31(10): 3839–3852.
[29] ZHANG Kaishuo, ROBINSON N, LEE S W, et al. Adaptive transfer learning for EEG motor imagery classification with deep convolutional neural network[J]. Neural networks, 2021, 136: 1–10.
[30] SZEGEDY C, VANHOUCKE V, IOFFE S, et al. Rethinking the inception architecture for computer vision[C]//2016 IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas: IEEE, 2016: 2818–2826.
[31] HE Kaiming, ZHANG Xiangyu, REN Shaoqing, et al. Deep residual learning for image recognition[C]//2016 IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas: IEEE, 2016: 770–778.
[32] HUANG Gao, LIU Zhuang, VAN DER MAATEN L, et al. Densely connected convolutional networks[C]//2017 IEEE Conference on Computer Vision and Pattern Recognition. Honolulu: IEEE, 2017: 2261–2269.
[33] GOODFELLOW I, BENGIO Y, COURVILLE A. Deep learning[M]. Cambridge: MIT press, 2016.
[34] 章杭奎, 刘栋军, 孔万增. 面向跨被试RSVP的多特征低维子空间嵌入的ERP检测[J]. 智能系统学报, 2022, 17(5): 1054–1061.
ZHANG Hangkui, LIU Dongjun, KONG Wanzeng. ERP detection of multi-feature embedding in the low-dimensional subspace for cross-subject RSVP[J]. CAAI transactions on intelligent systems, 2022, 17(5): 1054–1061.
[35] KINGMA D P, BA J L. Adam: a method for stochastic optimization[C]//Proceedings of the 3rd International Conference on Learning Representations. San Diego: ICLR, 2015: 1–15.

Copyright © CAAI Transactions on Intelligent Systems